GARM announces guidelines on misinformation, standards on ad placements, and expansion to cover the metaverse
New adjacency standards were unveiled for Feed, Stories, In-stream Video, In-stream Audio and Display overlay formats, along with plans to ensure brand safety is built into new metaverse spaces prior to monetization
The Global Alliance for Responsible Media announced a series of new guidelines as it marked its third anniversary at the Cannes Lions International Festival of Creativity.
The WFA-led coalition of multinational advertisers, agencies and platforms will be releasing guidelines on misinformation, new standards on ad placements, and an outline of first steps to make the metaverse safe for advertising. GARM is also working with the WFA’s National Associations Council to ensure the whole framework, including the new misinformation guidelines, is applied at a local level through national advertiser associations.
“While digital media owners tend to be global players, it is critical that we also take these vital controls down to a local market level. Brand safety standards need to be applied by all digital media owners so that advertisers can be sure that their messages are not funding bad actors or appearing against content that damages their standing in the eyes of their customers,” said Stephan Loerke, CEO of WFA.
The new misinformation guidelines, introduced in the wake of the COVID-19 pandemic and the Ukraine war, will form an essential addition to the existing GARM Brand Safety Floor and Suitability Framework. They have been designed to provide a structure for demonetizing misinformation, and to build on the success that the framework has already delivered in changing how brands set strategies, how media agencies build media schedules and how platforms and ad tech partners structure their tools.
In the GARM tradition of “uncommon collaboration”, the misinformation guidelines have been developed in coordination with the European Commission and in consultation with NGO partners, such as Consumers International, Reporters without Borders, ADL and NAACP.
“GARM has achieved much in a short space of time – more aligned definitions of harmful content, enhanced measures, and the introduction of adjacency controls. But more still needs to be done. Broadening definitions to include misinformation, introducing adjacency standards and a proactive approach to monetizing the Metaverse are important next steps in ensuring that our brands can safely reach the diverse consumers we serve,” said Marc Pritchard, Chief Brand Officer of Procter & Gamble.
Adjacency standards framework
As an additional effort, GARM is announcing a new Adjacency Standards Framework to ensure that media placement in safe but sensitive content is done in a more controlled way by advertisers, agencies and platforms alike. The new framework defines, for the first time, minimum standards and an approach for managing ad placement relative to sensitive content within News Feeds, Stories, In-stream Video, In-stream Audio and Display overlays.
“Building on the success of the GARM Brand Safety Floor and Suitability Framework, the Adjacency Standards Framework will provide advertisers and platforms with much-needed transparency together with a common language to better manage the proximity of advertising to sensitive content. This is a meaningful step for the industry moving from a lack of visibility, control and significant brand risk to a world where brands feel confident to invest in platforms that actively strive to provide a safe, transparent and effectively managed environment,” said Rankin Carroll, Global Chief Brand & Content Officer, Mars Wrigley.
The goal of these standards is to give ad buyers and ad sellers a common framework for managing ad placements next to sensitive but suitable content, such as coverage of death, injury or military conflict, allowing brands to restrict or allow where their messages appear. Further work on delivering similar standards for Livestream formats continues, with the initial priority being to define a minimum safety standard for monetization, given abuse of the format in a recent mass shooting event.
Embedding brand safety in the metaverse
GARM is also starting work to help industry stakeholders better understand brand safety principles and requirements within new metaverse spaces. The goal is to help identify appropriate opportunities in these new environments that bring together content and behaviours.
“GARM has been set up to ensure that the business model for advertising doesn’t fund harm and we did this in a reactive way – after business practices took shape in digital social media. We must help the industry understand safety requirements before commercialization begins in the metaverse. We are being asked by our members to start on this journey as new spaces emerge. We must ensure that advertising is aligned with sustainable and responsible growth models,” said Rob Rakowitz, GARM Co-Founder and Initiative Lead.
These new initiatives follow the publication of Volume 3 of the Aggregated Measurement Report, which tracks the progress of all platform members of GARM in delivering against the organisation’s eight key metrics.
Published earlier this month, the report details measured improvements in performance by the major platforms, including YouTube’s continued MRC accreditation. Since the first report was published in April 2021, the number of platforms covered has risen from seven to eight, with the addition of Twitch. The report has led to an improvement in the application of measurement best practices, with Authorized Metrics submissions increasing from 26 to 36 since launch.
The latest report illustrates continued improvement and areas of opportunity for further action, including:
- Enforcement in GARM categories Spam & Malware and Adult & Explicit Sexual Content continues to be the largest and most automated, given that these content types are most easily identifiable by technology;
- Enforcement in the highly nuanced category Crime & Harmful Acts to Individuals and Society relies heavily on context and remains the most manual. This area will continue to require an understanding of behaviour and intent; and
- Areas such as Misinformation and Self-Harm are becoming priorities for enforcement and reporting given the highly sensitive nature of these types of content.
This year, GARM celebrated its third anniversary at Cannes Lions where it was launched with 16 founding partner companies in 2019. The organisation now has 122 members, including 61 advertisers, 6 agency holding companies, 11 media platforms, 9 ad tech companies and 35 industry associations.
“As a founding member of the Global Alliance for Responsible Media, we are delighted to see the progress that’s been made over the last three years. When we launched GARM, we didn’t even have common definitions for harmful content! Since then, we have agreed common definitions, a shared framework to assess progress, and we’re now introducing new guidelines on misinformation and adjacency standards.” – Luis Di Como, EVP Global Media, Unilever
“Chief among the impressive list of GARM accomplishments is the publishing of the Misinformation category to the Brand Suitability Framework. GroupM looks forward to the immediate impact this step will have in driving increased transparency, verification, and mitigation of Misinformation, which has long been a major nemesis of brand safe, credible media environments.” – Joe Barone, Brand Safety Lead - Americas, GroupM
“Meeting the challenges posed by society and sustainability are key goals of the ANA’s 12-point Growth Agenda. The GARM guidelines provide brands with essential recommendations on how to achieve those objectives while also effectively addressing the double-edged issues of brand safety and misinformation. These are important new tools that every marketer should adopt.” – Bob Liodice, CEO, ANA
“The Aggregated Measurement Report provides brand owners with critical assurance on the safety of platforms for users. It’s encouraging that platforms continue to step up their transparency reporting efforts and that the leaders are now holding themselves accountable through independent audit.” – Phil Smith, Director General, ISBA
“The 4A’s and its APB (Advertiser Protection Bureau) are proud to continue supporting the expansion of GARM into critical areas including live-streaming and the metaverse. We applaud the global impact and scale of partners working together to address media responsibility in an increasingly fragmented and complex ecosystem.” – Marla Kaplowitz, CEO, 4A’s
"Since its founding in 2019, the Global Alliance for Responsible Media has demonstrated that industry can come together to meaningfully address the challenge of harmful content online. Meta’s commitment to safety and partnering across the industry is unwavering — from our significant investments in safety, platform controls and third-party verification. We look forward to continuing this progress in the years to come.” – Nicola Mendelsohn, VP Global Business Group, Meta
“Responsibility is our top priority. Through our longstanding partnership with GARM, we’ve made significant progress towards building a more sustainable and healthy digital ecosystem for everyone. We commend GARM on these latest milestones and continued commitment to develop industry-wide standards to keep consumers, creators and advertisers safe online.” – Debbie Weinstein, Vice President, YouTube & Video Global Solutions, Google
"Keeping people and brands safe online continues to be of critical importance. We are proud to be working with the GARM community to help the ad industry make meaningful contributions toward those ends. This latest step to combat the monetization of misinformation and create common standards and definitions related to brand suitability will be critical in creating a safe environment for everyone." – Sarah Personette, Chief Customer Officer, Twitter