We’ve been covering the topic of misinformation in the context of brand safety since the 2016 elections south of the border. For over two years, political misinformation has been compounded by health and safety topics related to Covid-19, creating discomfort for advertisers as they continue to tighten the reins on brand safety thresholds in times of uncertainty. While the industry works overtime to address privacy and other security issues, platforms have been innovating against misinformation.
We caught up with Andrew Peterson, Head of YouTube Canada, to get a better understanding of the situation as well as the work being done to combat misinformation on one of the world’s largest user-generated content platforms.
IAB Canada (IABC): Can you describe the challenge at hand?
Andrew Peterson (AP): YouTube’s mission is to give everyone a voice and show them the world. Every day, creators from Canada and around the globe are using YouTube to teach new skills, share stories, and build communities and businesses. In fact, new research from Oxford Economics calculated that YouTube’s creative ecosystem directly contributed $1.1B CAD to Canada’s GDP in 2021, and supported 34,600 jobs in Canada. Advertisers big and small are also finding new audiences and growing their businesses on the platform.
The success of creators and advertisers is possible in part due to the incredible scale that YouTube provides. Over two billion logged-in users watch YouTube monthly to catch up with their favourite creators, be entertained or learn something new. Operating at this level of scale means that YouTube has become a reflection of the world around us, including our greatest societal accomplishments and challenges.
Our unique business model only works when our viewers, creators and advertisers have confidence that we are living up to our responsibility as a business. As misinformation moves from marginal to mainstream, we’ve made stopping the spread of misinformation one of our deepest commitments.
IABC: What kind of solutions are being implemented?
AP: Safety – for advertisers, creators, and users – is at the core of everything we do. We’ve always had robust guidelines governing the platform to keep our ecosystem safe.
Over the past few years, we’ve been investing in the policies, resources and products needed to live up to our responsibility commitments, and protect the YouTube community from harmful content. This work has focused on four pillars: removing violative content, raising up authoritative content, reducing the spread of borderline content and rewarding trusted creators. Thanks to these investments, videos that violate our policies are removed faster than ever and users are seeing less borderline content and harmful misinformation. As we do this, we’re partnering closely with lawmakers and civil society around the globe to limit the spread of violent extremist content online.
This work is important to us, which is why we release a quarterly Community Guidelines Enforcement Report providing data on the flags YouTube receives and how we enforce our policies.
We’re proud of the work we’ve done to date, but there is always more work to do here. As the digital world evolves, we evolve with it and we continue to invest in our policies, teams of experts and enforcement technology to stay ahead of potential threats.
IABC: How comfortable should advertisers feel about leveraging platforms like YouTube to connect with audiences without the threat of supporting misinformation?
AP: Since advertising is at the core of creators’ revenue, it is vital that advertisers have faith in our systems and feel comfortable with where their ads appear. We understand that advertisers do not want their brands associated with problematic content and actors, and when advertisers lack trust in our systems, they scale back their spend on YouTube. This affects the entire ecosystem.
That’s why keeping the platform safe is not only the right thing to do, but it’s also good for business. By partnering with advertisers to address their feedback, we’re now at least 99% effective at assuring brand safety for advertisers. As a result, YouTube was the first digital platform to be accredited for content-level brand safety by the Media Rating Council, a distinction we received for a second year in 2022. YouTube was also one of the founding members of the Global Alliance for Responsible Media (GARM), a multi-stakeholder initiative to improve digital and brand safety with advertisers. As part of this initiative, we’ve helped establish a set of industry standards to define content not suitable for advertising.
We’ve worked hard to build a strong foundation of trust with our Canadian advertising partners. Again, there is always more work to do, but we share the same goal as advertisers to protect our community and continue to grow and support the Canadian creator economy.
IABC: What can the industry at large do?
AP: Working collaboratively with the industry and civil society has been instrumental in helping to keep our platform safe. We’ll continue to partner across the ecosystem to mitigate the spread of harmful content, reduce borderline content, and raise up authoritative information.