San Francisco — YouTube will ban videos that promote QAnon and other conspiracy theories, but only when they target specific people or groups, as the company seeks to crack down on potentially dangerous misinformation after criticism that the service helped such fringe movements expand.

The decision comes a week after Facebook said it would remove accounts associated with QAnon, a far-right movement that the FBI has reportedly labelled a domestic terrorism threat.

YouTube’s ban is an attempt to stamp out the conspiracy without hindering the huge volume of news and political commentary on its service. Rather than a blanket prohibition of QAnon videos or accounts, YouTube is expanding its hate and harassment policies to include conspiracies that “justify real-world violence”, the company said on Thursday.

“Context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up,” YouTube, a unit of Alphabet’s Google, wrote in a blog post.

Technology platforms have released a blitz of new rules to curb misinformation as movements such as QAnon gain momentum. Twitter recently said it would make it harder for people to find tweets supporting QAnon, while Etsy removed QAnon-related merchandise from its online marketplace.

Pressure for these companies to act has been building for months. YouTube already instituted a policy similar to Twitter’s, though it did not publicise it. Starting in 2019, the service began to treat QAnon videos as “borderline content”, meaning the clips are recommended and shown in search results less often. Views from recommendations on “prominent” QAnon videos have dropped 80% since then, the company said.

YouTube was a key driver of QAnon’s early popularity, according to Angelo Carusone, president and CEO of Media Matters for America, a non-profit group that analyses conservative misinformation.

A QAnon evangelist called PrayingMedic has almost 400,000 subscribers on his YouTube channel, for instance. And even after YouTube’s borderline-content move last year, QAnon videos spread from the Google service to other sites. YouTube broadcasts about the conspiracy theory featured regularly in Facebook groups and pages until Facebook’s recent ban, and YouTube QAnon clips continued to be shared on niche services such as Parler.

Still, Carusone said YouTube’s efforts to slow the spread of the conspiracy theory have been relatively effective in recent months.

The tech platforms and QAnon supporters are now likely to enter a game of cat and mouse, in which users devise new hashtags and reworded claims to evade automated filters. QAnon followers have proved particularly adept at this, according to Carusone.

“There has never been a community where their participants are as adaptable,” he said.

A significant unanswered question is how well YouTube can identify videos designed to be less obvious upon initial inspection, Carusone added.

“It is very easy for them to identify explicitly identified QAnon content and accounts,” he said. “What they have not articulated is how well that can be applied to less-explicit accounts.”

Bloomberg
