The change is designed to give YouTube more time to learn about a channel before it begins to sell advertising on it and share that revenue with the channel's owner. Theoretically, that will make it easier for YouTube to catch bad actors who aren't there to build up an audience but to manipulate other users or spread false information.
A significant number of channels are expected to be impacted by the change. But in the blog post, YouTube executives point out that 99 percent of those channels were "making less than $100 per year in the last year."
Further, YouTube is announcing that it will introduce new procedures to vet the videos that are part of its premium advertising tier, Google Preferred. Under the new plan, previously reported by Bloomberg, Google Preferred ads will now run only on videos that have been human-verified as brand safe. YouTube says this verification process will be completed by mid-February in the U.S. and by the end of March for the rest of the world.
YouTube is also revamping the tools that it gives advertisers to control the placement of their content. A new three-tiered system will be put in place to allow brands to provide feedback on the placement of their ads.
Last year, several advertisers temporarily pulled their spots from YouTube after media reports revealed that their ads had run alongside inappropriate content, including a video by top YouTuber PewDiePie that included an anti-Semitic joke. In the months that followed, YouTube faced similar scrutiny for allowing Russian propaganda to flood its platform and bad actors to create exploitative kids' content.
In late December, CEO Susan Wojcicki acknowledged in a pair of blog posts that YouTube needed to change how it monitored content on its platform. She announced that YouTube's trust and safety teams would grow to more than 10,000 employees in 2018, and she promised that YouTube would announce additional changes in the coming weeks.
Then, on Dec. 31, Logan Paul came under fire for posting a video in a Japanese forest that included the image of a man who had committed suicide. YouTube responded by removing Paul from Google Preferred and putting all of his original projects on hold.
In their blog post, Mohan and Kyncl acknowledged that the changes announced Tuesday won't necessarily create a solution for incidents like the ones that PewDiePie and Paul created in 2017. "While this change will tackle the potential abuse of a large but disparate group of smaller channels, we also know that the bad action of a single, large channel can also have an impact on the community and how advertisers view YouTube," they wrote. "We'll be working to schedule conversations with our creators in the months ahead so we can hear your thoughts and ideas and what more we can do to tackle that challenge."
This article was originally published by The Hollywood Reporter.