The GARM Working Charter: Priorities for Uncommon Collaboration
This document is our cross-industry alliance’s working charter. It represents our current, shared agreements on where we will focus our multilateral work; as such, the GARM reserves the right to update the document to reflect changes in priorities. This document is correct and approved as of 17 January 2020.
The rapid growth of digital communications and commerce has connected the world in unprecedented ways. Many of these connections come from advertising-supported platforms, which provide immense utility to the billions of people who use them. But as the size of the audiences and the volume of advertising and commerce on these platforms have grown, bad actors have in turn been drawn to these environments.
These individuals or groups act as advocates for harmful behavior online, spreading content that glorifies harmful behaviors and at times actively profiting from it. This dynamic poses a threat that is too costly for all: people, brands, agencies and media platforms.
Many industries and organizations have robust responsibility and safety programs around how they source, create and distribute products – we must extend this same sensibility into advertising and media, given their impact on consumers and society. This means:
- taking actions that will better protect everyone (children in particular) online;
- working towards a media environment where hate speech, bullying and disinformation are challenged; and
- taking steps to ensure personal data is protected and used responsibly when it is shared.
We are committed to removing the economic incentives and closing the operational loopholes that bad actors exploit.
We have spent considerable time immersing ourselves in the challenge: speaking to technologists, researchers, platforms, agencies, and advertisers. The opportunity for the GARM is clear: we will make it easier for all participants to uphold responsible media operations by creating norms for technology development, the categorization and monetization of content, and incident measurement. We believe this will ultimately improve the consumer and brand experience, and thereby reduce societal risk. We will enable this through three strategic focus areas:
- Establishing shared, universal safety standards for advertising & media
- Improving and creating common brand safety tools across the industry
- Driving mutual accountability and independent verification and oversight
The GARM will now seek to enable these three strategic focus areas through the following working solutions, which will be prioritized via the GARM’s roadmap. This collection of endeavors reflects our current view of where to drive our uncommon collaboration. The GARM will report on how we prioritize and deliver across these areas in our journey together:
Establishing shared, universal safety standards for advertising & media
- Advance shared language and standards for advertising & media: Advance the adoption of an industry-wide common definition of categories of harmful, non-brand-safe and suitable content to improve the effectiveness of consumer and brand safety across platforms, geographies and formats.
- Establish advertiser-facing, visible rating systems to improve trust and governance: Verify that each platform has an easy-to-access and easy-to-understand risk framework for advertisers and agencies that works across advertising products and content.
- Educate operators and stakeholders: Commit to scaling easy-to-access education programs to ensure that all parties are aware of existing tools and best practices to advance online safety.
Improving and creating common brand safety tools across the industry
- Create accessible controls for monetization: Ensure that platforms have accessible controls that can quickly and effectively remove harmful content from monetization pools and campaigns based on guidelines.
- Improve operations to better connect risk settings: Ensure that consumer and brand safety protection practices keep up with the speed of changing community standards by creating tools that better connect with advertiser and agency risk settings.
- Make conversations safer: Ensure that safety tools and processes can address the safety and suitability concerns in comments when and where appropriate.
- Create new anticipatory tools together: Support individual platforms and cross-platform collaboration to create anticipatory tools which contain and remove harmful content and bad actors.
Driving mutual accountability and independent verification and oversight
- Advance how impact is measured: Explore ways to advance transparency and independent partnership in how industry partners report on their steps to categorize, size and remove harmful content and its creators.
- Clarify platform safe-content policies: Create transparency in how individual platforms make their content and monetization decisions, and detail their remediation policies to affected parties.