LOS ANGELES - YouTube, under fire for facilitating the spread of conspiracy theories and other misinformation, said it will no longer serve ads on channels that espouse anti-vaccination rhetoric. The Google-owned video giant cited its advertising policy, which bars "dangerous and harmful" content from eligibility in its monetization program.
"We have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content are a violation of those policies," YouTube said in a statement. "We enforce these policies vigorously, and if we find a video that violates them, we immediately take action and remove ads."
YouTube didn't specify which anti-vax channels -- or how many -- were subject to the demonetization action. According to BuzzFeed, which first reported the news, affected channels included VAXXED TV, LarryCook333 and iHealthTube.
The issue hits two ongoing pain points for YouTube: its role in amplifying conspiracy-mongers, and marketers' concerns about the platform's "brand safety." In a separate controversy, marketers including AT&T, Disney, Hasbro and Epic Games this past week suspended all ad spending on YouTube after a secret ring of child predators was discovered to be posting coded sexual comments on videos of young girls.
Anti-vaccination content, which falsely suggests that vaccines are somehow unsafe for kids, remains rife on YouTube and other platforms. Overwhelming scientific and medical evidence shows that childhood vaccinations are both safe and effective at curbing disease outbreaks.
In January, YouTube announced a change to its content-recommendations system aimed at reducing the recommendation of "borderline content and content that could misinform users in harmful ways." That includes videos promoting bogus miracle cures for serious illnesses, claiming the Earth is flat, or making "blatantly false claims" about historical events like the 9/11 terrorist attacks, according to YouTube. The algorithm changes will initially affect recommendations for a very small set of videos in the U.S., with plans to expand internationally. And while YouTube repeatedly claims that objectionable material like conspiracy videos represents well under 1% of its overall content, that would still amount to millions of videos.
On another front, YouTube last year began displaying links to Wikipedia articles on conspiracy videos. For example, many anti-vaccination videos on YouTube now point viewers to Wikipedia's "Vaccine hesitancy" article.
Of course, YouTube isn't the only internet company accused of providing a megaphone for crackpot ideas. Pinterest, for its part, recently opted to block all vaccination-related searches outright on its image and video-sharing platform, as first reported by the Wall Street Journal.
Last week, Rep. Adam Schiff (D-Calif.) sent a letter to Google and Facebook requesting details on how the companies planned to combat the spread of information discouraging parents from vaccinating their children. "I was pleased to see YouTube's recent announcement that it will no longer recommend videos that violate its community guidelines, such as conspiracy theories or medically inaccurate videos, and encourage further action to be taken related to vaccine misinformation," he said in a statement.
YouTube's decision to pull ads from channels purveying anti-vaccination conspiracies and false information came after BuzzFeed News reported that the platform's video-recommendation algorithm frequently surfaced anti-vax videos when users searched for vaccination-related terms.
Advertisers whose spots had appeared in anti-vaccination YouTube videos -- apparently without their knowledge -- include Vitacost, Retail Me Not, Brilliant Earth, CWCBExpo, XTIVIA and SolarWinds, per the BuzzFeed report.