WFA’s Global Alliance for Responsible Media (GARM) releases Volume Five of the Aggregated Measurement Report. Under its charter, GARM is committed to understanding how the ad-selling industry voluntarily improves monetization safety.
GARM’s Measurement and Oversight Working Group has delivered the fifth iteration of the Aggregated Measurement Report, which answers four questions on how well platforms are enforcing their policies in the context of the brand safety floor:
- How safe is the platform for consumers?
- How safe is the platform for advertisers?
- How effective is the platform at enforcing its safety policy?
- How responsive is the platform at correcting mistakes?
Since the report’s first release, two additional platform participants have joined (Twitch and LinkedIn). Today’s report includes self-reported data from Facebook, Instagram, Pinterest, Snap, TikTok, Twitter, YouTube, Twitch and LinkedIn. Authorized metrics have risen from 48% of submissions to 68% and misinformation data has been included.
Consumer safety is improving when we look at trends over time
Looking at user safety metrics – typically expressed as Prevalence or Violative View Rates – and their trends over time, the data shows decreases in the prevalence of violative content.
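A Violative View Rate is, broadly, the share of all content views that landed on policy-violating content. A minimal illustrative sketch (the function name and figures are hypothetical, not from the report):

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Fraction of all views that were of policy-violating content."""
    if total_views == 0:
        return 0.0
    return violative_views / total_views

# Hypothetical example: 4,000 violative views out of 1,000,000 total
rate = violative_view_rate(4_000, 1_000_000)
print(f"{rate:.2%}")  # prints "0.40%"
```

Platforms typically report this as a percentage range rather than a point estimate, since it is measured by sampling views.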
Share of enforcements analysis shows topics related to individual safety, group hate and graphic violence emerge as high-volume areas
The report highlights the actions taken across platforms in the areas of both content removals and account suspensions. The largest shares of enforcements fall into the following GARM content categories:
- Crime & Harmful Acts to Individuals and Society, Human Rights Violations
- Death, Injury or Military Conflict
- Online Piracy
- Hate Speech & Acts of Aggression
- Obscenity and Profanity, Including Language, Gestures, and Explicitly Gory, Graphic or Repulsive Content Intended to Shock and Disgust
Enforcement data shows an increase in content removals and a decrease in account removals
Analysis of platform enforcement trends shows a 13% increase in content removals, whereas account removals were down by 16%. The significant increases in content removals are linked to individually defined and driven platform enforcement efforts across various policies, with a trend toward policies mapped to GARM's “Crime & Harmful Acts to Individuals & Society, Human Rights Violations” category.
Looking ahead to Volume Six, our next report will cover three years of data. This will give us a unique opportunity to assess progress over time and gain a deeper understanding of emerging data trends. Further, we are pleased to share that Spotify will be joining as GARM’s eleventh platform participant.