Insights from GARM’s Aggregated Measurement Report Volume 4

GARM’s focused view on social media content enforcement data shows increased action on war and weaponry amid the ongoing war in Ukraine

Article details

  • Author: Rob Rakowitz
    Initiative Lead, Global Alliance for Responsible Media
Opinions
9 November 2022

We launched the Aggregated Measurement Report with key industry stakeholders in mind – CMOs, media leaders, and brand safety experts – to give a consolidated view of industry progress and a series of insights to help the industry identify trends.

Our latest volume reflects real-world events in a way we’ve not seen previously. Like all of the previous Aggregated Measurement Report volumes, this is a backward-looking analysis. Volume 4 focuses on a year’s worth of data submissions spanning the first quarter of 2021 to the second quarter of 2022 (however, Snap and Twitter’s data is older due to their later release periods).

We recognize that the context for this report’s release is increasingly sensitive, whether real world events or the recent acquisition of Twitter. Many of the platforms in GARM play critical roles in news-based events, and their ability to moderate and monetize content is tested by these dynamic events. It is likely that these events will impact the next volume of our report. We should expect changes in platforms, products, and events to continue to shape the trends and explain some of the ‘why’ and ‘how’ behind the data we report.

As we navigate this fourth volume, what did we learn?

Enforcements start to reflect a regional conflict

In this volume of the Aggregated Measurement Report, we see a 39% rise in the volume of violative content removed from two harmful content categories linked to the war in Ukraine – Death, Injury, Military Conflict and Arms & Ammunition. If we look only at content removals for Death, Injury and Military Conflict, we see that enforcement in this area has increased significantly on certain platforms: Pinterest (+798%), Twitter (+23%), and Facebook (+21%).

Adult & Explicit Sexual Content still dominates

While Death, Injury, Military Conflict and Arms & Ammunition were the fastest-growing categories during Q1 and Q2 2022, the biggest categories for enforcement identified in the report remain Spam & Malware and Adult & Explicit Sexual Content. In particular, Adult & Explicit Sexual Content remains the top reason for content removals on Pinterest and Snap, and the top reason for account removals on Twitter, at 45% of total account removals.

Increased enforcement on Illegal Drugs, Tobacco, e-cigarettes, vaping and alcohol

Other areas showing increased enforcement include Illegal Drugs, Tobacco, e-cigarettes, Vaping and Alcohol. Half of the platforms participating in the Aggregated Measurement Report saw an increase in content removals in this period versus the prior period; the highest increases were reported by Twitter (+36%) and Instagram (+23%). Improved policies and technologies are driving increased enforcement in this harmful content category.

Increased participation and measurement best practices

We should also reflect on the change we drive through this common reporting framework. Since the first report was published in April 2021, the number of platforms covered has risen from 7 to 8. Additionally, the share of submitted data using Authorized Metrics, which represent measurement best practices, has increased from 48% to 68% of submissions.

As we go forward, what should we look for?

More consistency in measuring misinformation

We just released GARM’s definition of Misinformation in June, and we see a few platforms incorporating it formally, while most are reporting on enforcements in an ad-hoc manner. As complementary efforts mature – such as the European Commission’s Code of Practice on Disinformation, which involves GARM and several member organizations – we should see more formal reporting on this new harmful content category.

More transparency on local measurement

Global transparency reporting is a significant endeavor. Two of the principal Authorized Metrics in the report, on consumer safety and advertiser safety, largely rely on a global sample measurement methodology. Under these global metrics, samples and forecasts are made for local markets and languages. We will work with platforms and local markets to disclose these local samples, and also explore ways to report on key actions at the local level.

More analyses

Our next volume will have two years of data. This will give us a unique opportunity to assess progress and seasonality, and to introduce new analyses and visualizations. With this we anticipate new levels of insight into progress on this important issue.

Developing the Aggregated Measurement Report is a significant undertaking in collecting data and commentaries and developing insights. We would like to thank our partners YouTube, Meta, Twitter, Pinterest, Snap, Twitch, and TikTok for their participation. We would also like to thank leaders from GroupM and ISBA for their continued support in helping drive the interpretation further.
