App icons for Signal, WhatsApp, and Telegram. Picture: Brent Lewin/Bloomberg

San Francisco — One of the earliest people to get Pfizer’s Covid-19 vaccine was a nurse in Tennessee, who fainted after getting the shot on live television in December. The incident sparked rumours that she had died and that the vaccine was a tool of genocide.

Five months later the nurse, who is not dead, continues to be bombarded by messages from strangers on social media. They send condolences to her family or demand details about the incident. Oddly, they often do so in German, Italian or Portuguese.

The international fixation on this case follows what is becoming a common pattern. US-based social media users begin spreading misleading or false information, which then moves to other countries, according to researchers studying the rumours. The US may not yet have figured out an efficient way to distribute shots to other countries, but it has become a major exporter of misinformation.

The US “dominates social media culture the way it dominates pop culture”, and not always to the world’s benefit, says Cameron Hickey, project director for algorithmic transparency at the National Conference on Citizenship, a civic organisation based in Washington. He’s been researching misinformation in Spanish-speaking Facebook communities and says most of what circulates is “a carbon copy of rumours that we first see in English”.

Major social media companies are constantly criticised for failing to crack down on misinformation in the US, but advocates complain their performance is even worse when dealing with content in languages other than English. The machine-learning algorithms that companies such as Facebook and Alphabet’s YouTube use for content moderation were built in English first and aren’t as effective in other languages, Hickey says.


Advocates have also criticised the tech companies for not sufficiently staffing content moderation teams outside the US.

Facebook has 80 fact-checking partner organisations working in 60 languages, according to Kevin McAlister, a spokesperson. “We use machine-learning models in more than a dozen languages to send potentially violating content about Covid-19 and vaccines to reviewers — including native speakers — who remove content that is in violation,” he says.

The company doesn’t share the number of content reviewers in a given country, saying it wouldn’t reflect its efforts. Elena Hernandez, a YouTube spokesperson, says its “approach to addressing misinformation is global and applies across all languages”.

Content often continues to go viral abroad even after it’s removed or fact-checked and deprioritised in the US. Videos from anti-vaccination influencers that Facebook has banned, such as Del Bigtree and Sherri Tenpenny, have been translated into Arabic, Dutch, German, and other languages, and continue to circulate among Facebook groups and YouTube channels.

The Plandemic video, one of the most prominent pieces of coronavirus-related disinformation over the past year, spread widely in Italian, Polish and Spanish for days after Facebook took action on the English version, according to Renée DiResta, the technical research manager at the Stanford Internet Observatory.

A review by the Institute for Strategic Dialogue, a London-based anti-extremism group, found that Arabic-speaking communities are influenced by content from the US or Europe, “using Arabic subtitling or voice-overs often unencumbered by moderation and fact-checking efforts”. For instance, Facebook removed US-made videos of women who appeared to be shaking in response to the Covid-19 vaccine, but the same content with Arabic subtitles continued to spread without a fact check.

US groups pushing disinformation also provide a “blueprint or how-to guide” for local anti-vaccine groups to follow, according to Ciaran O’Connor, an analyst at the Institute for Strategic Dialogue.

People have used data from the US Vaccine Adverse Event Reporting System (VAERS) to undermine public confidence. The government established VAERS in 1990 as an early warning system to detect potential side-effects. Patients can report reactions independently, and doctors and vaccine makers also submit reports.

Misleading content about side-effects and deaths based on VAERS data is spreading on Dutch, German and Spanish Facebook pages. Conspiracy theorists outside the US have also begun to use VAERS-like programmes in their countries as the raw material for their own claims, O’Connor says.

People disseminating disinformation abroad don’t necessarily feel secure in their ability to continue using mainstream social media and are learning from their US peers how to prepare for their potential expulsion, says Kristina Gildejeva, a researcher for a fact-checking group.

Gildejeva has been monitoring anti-vaccine communities in Germany. They were active on Facebook, she says, until recently, when they “followed their English-speaking counterparts and switched to Telegram”.

More stories like this are available on Bloomberg Businessweek.

