First-ever report creates unprecedented transparency across the industry and a new benchmark for charting progress on removing harmful content from ad-supported media
The Global Alliance for Responsible Media (GARM) has today launched its first report tracking performance on brand safety across seven platforms, including Facebook, Instagram, Twitter and YouTube, as the next step in its mission to improve the safety, trustworthiness, and sustainability of media.
By aggregating existing platform transparency reports and adding policy-level granularity, the new document creates a common framework that enables advertisers to assess brand safety progress for each platform member of GARM. The new framework also drives simplicity and focus, and highlights the use of best-practice methodology.
The GARM Aggregated Measurement Report is based around four key questions marketers can use to assess progress over time. The report is consistent with the common framework used to define harmful content not suitable for advertising and introduces aggregated reporting. Together, the common definitions and aggregated reporting deliver consistency in well-established practices while advancing best practices into industry standards.
Ultimately, the report provides a common and focused framework for advertising industry stakeholders to make more informed decisions about their advertising investment.
Highlights from the latest data show that more than eight in ten of the 3.3 billion pieces of content removed across the platforms participating in the report come from three leading categories – Spam, Adult & Explicit Content, and Hate Speech & Acts of Aggression.
The data also highlights increased action on Hate Speech & Acts of Aggression across platforms. GARM platforms have reported increases in both activity and impact, with significant progress by YouTube in the number of account removals, Facebook in the reduction of prevalence, and Twitter in the number of pieces of content removed.
These initial improvements have occurred amid an increased reliance on automated content moderation to help manage blocking and reinstatements due to COVID-19 disruptions that resulted in moderation teams working with limited capacity.
GARM is a cross-industry initiative founded and led by the World Federation of Advertisers (WFA) and supported by other trade bodies, including the Association of National Advertisers (ANA), Incorporated Society of British Advertisers (ISBA) and the American Association of Advertising Agencies (4A’s).
“We have built on our agreed definitions to produce a detailed database of the progress that’s being made on reducing harmful content and the potential for monetization across the digital platforms. The collaboration between advertisers, agencies and platforms has been very constructive and we now have common ground to drive even greater progress for the benefit of society, marketers and the long-term health of the digital ecosystem,” said Stephan Loerke, CEO of the WFA.
The report follows nine months of collaborative workshops between major advertisers, agencies and key global platforms working together as one of GARM’s Working Groups, bringing together for the first time, in a single agreed location, data around four core questions and eight authorised metrics identified as critical to tracking progress on brand safety.
The Aggregated Measurement Report provides a simple and transparent framework based around four core questions that advertisers can use to understand how well the platforms are enforcing their policies in the context of the brand safety floor:
- How safe is the platform for consumers? The prevalence of harmful content will be reported as the number of views of harmful content as a percentage of all views of content.
- How safe is the platform for advertisers? The incidence of advertising appearing in the context of harmful content will be reported as the number of ad impressions on harmful content as a percentage of all ad impressions. For newsfeed environments, the overall consumer prevalence measure above will be reported.
- How effective is the platform at enforcing its safety policy? This will be reported as the total number of pieces of harmful content removed and the number of times it has been viewed.
- How responsive is the platform at correcting mistakes? This will be reported as the total number of appeals made by users and the number of reinstatements made by platforms.
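The two ratio-based metrics and two count-based metrics above can be sketched in code. This is an illustrative example only, not GARM methodology; all figures, function names and variable names below are hypothetical, invented for demonstration.

```python
# Illustrative sketch of the four GARM reporting metrics.
# All numbers here are made up for demonstration purposes.

def prevalence(harmful_views: int, total_views: int) -> float:
    """Q1 - consumer safety: views of harmful content as a share of all views."""
    return harmful_views / total_views

def ad_incidence(harmful_ad_impressions: int, total_ad_impressions: int) -> float:
    """Q2 - advertiser safety: ad impressions on harmful content
    as a share of all ad impressions."""
    return harmful_ad_impressions / total_ad_impressions

# Q3 - enforcement effectiveness: reported directly as counts.
enforcement = {"pieces_removed": 1_200_000, "views_before_removal": 4_500_000}

# Q4 - responsiveness: reported directly as counts.
responsiveness = {"appeals_made": 30_000, "reinstatements": 4_000}

print(f"Prevalence: {prevalence(150, 1_000_000):.4%}")    # prints "Prevalence: 0.0150%"
print(f"Ad incidence: {ad_incidence(90, 2_000_000):.4%}")
```

Note that the first two questions are rates (lower is better), while the last two are raw counts, which is why the report presents them separately rather than combining them into a single score.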
Independent oversight and measurement are critical to the GARM initiative, helping create accountability on the challenge of harmful content. It enables each member to ask: how are we progressing collectively? How are we progressing individually? How are we tackling each of these topic areas?
Today’s report includes self-reported data from Facebook, Instagram, Pinterest, Snap, TikTok, Twitter and YouTube. The full data set can be downloaded here. Twitch, which only joined GARM in March, will join the reporting process for the next report, due later this year.
GARM’s Working Groups continue to work on other areas of focus, including better adjacency controls for brands, and the alliance hopes to announce further initiatives later in the year.
Raja Rajamannar, Chief Marketing and Communications Officer, Mastercard and WFA President: “This report is great progress for our joint efforts, bringing together consistent and reliable data that marketers can depend on. It establishes common and collective benchmarks that reinforce our goals and help brand leaders, organizations and agencies make sure we keep media environments safe and secure.”
Marc Pritchard, P&G Chief Brand Officer: “There’s no place for harmful online content in media that’s monetized by advertising, and we need to understand the size of the problem and track progress over time. The GARM Aggregated Measurement Report is an important step forward in helping brands advertise in safe and suitable places—a critical element for consumer trust.”
Conny Braams, Chief Digital and Marketing Officer, Unilever said: “When we launched Unilever’s Media Responsibility Framework, we called for collective action to rebuild trust within the digital ecosystem, including the need for consistent metrics and measurement to evaluate and eliminate harmful content across platforms. The launch of GARM's Aggregated Measurement Report signals the accelerated progress and collective commitment across the industry to address the growing challenges. While we don’t yet have a perfect solution, it's the first time we can quantify and compare the harmful content that exists across platforms, to learn collectively and act. This is a welcome and important step forward in our mission to protect people and our brands online.”
Carolyn Everson, VP, Global Business Group, Facebook: “In 2018, we started Facebook’s transparency report to help people understand how we’re doing at enforcing our policies. However, we recognize marketers need to be able to have a single report to understand the industry’s progress through the lens of a common language and framework. The GARM Aggregated Measurement Report is a big step forward to help simplify these reports for marketers.”
Debbie Weinstein, Vice President, Global Solutions, YouTube: “The Aggregated Measurement Report is a great example of GARM delivering on its mission to bring the industry together to improve the safety, trust and sustainability of digital media. It is our hope that the report helps advertisers more easily assess the progress platforms like YouTube are making in this critical area.”
Sarah Personette, VP, Global Client Solutions, Twitter: "From its conception, GARM has fostered an open and honest exchange of ideas to solve critical problems that will help drive positive global impact. Twitter believes in the power of a public and open conversation, and our ongoing work with GARM further reinforces our enduring commitment to provide transparency into the work we are doing to support the health of the public conversation."
Phil Smith, Director General, ISBA: “We’re delighted by the publication of this first GARM Aggregated Measurement Report. Its common framework will allow advertisers better to assess the progress made by platforms in ensuring they provide safe environments for users and for advertisers. The report is the product of unparalleled collaboration across the industry and is a vital step forward in GARM’s mission to improve the safety, trustworthiness, and sustainability of media. ISBA is proud to be an active member of GARM’s Steering Group, as we strive to create a trusted and accountable advertising environment.”
Notes for editors
The Global Alliance for Responsible Media (GARM) is an initiative led by WFA and brings together advertisers, agencies, media companies, platforms and industry organisations to improve digital safety.
Members of the Global Alliance for Responsible Media recognise the role that advertisers can play in collectively pushing to improve the safety of online environments. Together, they are collaborating with publishers and platforms to do more to address harmful and misleading media environments; and to develop and deliver against a concrete set of actions, processes and protocols for protecting brands. More at www.wfanet.org/garm