The Digital Services Act: changing the rules of the (advertising) game?


At the end of August, big tech platforms were called upon to lay their cards on the table. This was the deadline by which they needed to comply with the EU’s Digital Services Act (DSA), a law aimed at protecting consumers from illegal content and regulating online advertising.

Article details

  • Junior Digital Policy Manager
  • Opinions
  • 23 October 2023

However, the new rules are already proving to be complex and contested. The legal text leaves much room for interpretation, and some platforms, such as Amazon, have already objected to their designation as ‘Very Large Online Platforms’ under the DSA.

While enforcement will be crucial to clarify how the rules should be interpreted, GDPR aficionados know how long and painful that might be.

The rules are already being tested, particularly in light of the Israel-Hamas war and the related disinformation and violent content that has spread across social media platforms.

Questions are being raised about the impact of the new rules. So what signs are there that platforms have really changed the way they play?

Real-time ad transparency

Consumers demand more transparency over the ads they see and how their personal data is used. This is why the DSA requires all online platforms to clearly label advertisements and to provide ‘meaningful’ information about the ‘main parameters’ used to target ads, as well as how users can change these parameters.

Meta’s updated ‘why am I seeing this ad’ function now allows users to see that they were targeted with a sports shoe ad because they followed a page about football. TikTok, on the other hand, tells users they are seeing certain ads on the basis of “activities on and off” the platform, without specifying (for now) which activities.

Online platforms are interpreting these transparency obligations differently. It remains to be seen whether the measures will improve consumer trust in online advertising or simply create more confusion.

A one-stop-shop for ads transparency

The DSA requires so-called ‘Very Large Online Platforms’ to create a public database of all ads displayed on their platforms, retained for one year after an ad was last shown. The database will include information about who paid for the ad, who the intended audience was, how long the ad was displayed, and the number of ad impressions per market.

This aims to help researchers and regulators identify risks in online advertising, such as disinformation campaigns or discriminatory targeting practices.

From an advertiser perspective, the database creates potential challenges. Making the data publicly available could enable competitors to extrapolate information about brand marketing strategies and advertising spend. WFA has raised these concerns with regulators and platforms.

Restrictions on targeting practices

A majority of consumers remain critical of practices that rely on personal data to target ads. The DSA therefore bans ad targeting based on profiling that uses sensitive personal data, as well as profiling-based targeting of minors.

TikTok has since banned personalised advertising to 13- to 17-year-olds in Europe. Meta says advertisers on its platform will still be able to target minors based on age and location. Google claims it already has policies banning the use of sensitive information on health, race, religion or sexual orientation to target ads.

Questions remain about whether each platform’s vision of compliance will satisfy regulatory and consumer demands.

Clear content moderation

Finally, the DSA aims to drive accountability and transparency around platforms’ content moderation practices, and to assess the overall risks platforms pose to users.

This includes an obligation to publish content moderation reports, indicating the effectiveness of action taken to remove illegal content. WFA believes this could help brands assess the impact of policies across platforms and complement the content monetisation reports developed by the Global Alliance for Responsible Media (GARM), a WFA-led initiative aimed at preventing the monetisation of harmful content through advertising.

Large online platforms will also have to conduct regular risk assessments. We believe this could help advertisers judge the level of safety of online platforms and their efforts to keep users safe.

Although these resources will help brands make more informed media decisions, the lack of standardisation in reports and assessments could make comparisons across platforms a challenge.

Playing the long game

Will the DSA keep users safe online and in the physical world?

Many people are now watching closely to see how seriously the law will be implemented and enforced, and to identify shortcomings. In parallel, the UK is moving towards enforcing its own set of rules under the ‘Online Safety Bill’. Regulators in many countries are mulling similar legislation.

Amid this uncertainty, we will continue to collaborate with platforms to drive clarity around the implementation of the DSA and monitor regulatory developments at global level.

Despite all the questions about whether the DSA will deliver on its online safety, transparency, and content moderation objectives, one thing seems clear:

It’s game on.
