Picture: 123RF/Karen Roach

It’s common cause that Leni Riefenstahl was a gifted filmmaker, her movies well shot and hugely evocative. The problem is that those films were almost pure Nazi propaganda, their artistry covertly selling the Third Reich’s nonsense — that Aryans were superior to other races — under titles such as Triumph des Willens, or Triumph of the Will.

Sadly, this is just one example of insidious propaganda that uses the medium of the day for its own ends. During the 1994 Rwandan genocide it was radio, where the word "cockroach" took on a deadly meaning.

There’s been an unfortunate evolution of such nefarious use of popular media. In 2019, the propaganda frontier has expanded to the mass media of our day: YouTube, social media and smartphone apps.

The world is experiencing a terrifying backlash against rationality and science. This is especially true when it comes to vaccinating children.

"Antivaxxing" stems from one of the worst scientific frauds of all time. Now-disbarred British doctor Andrew Wakefield falsified research and had it published in The Lancet in 1998. The journal later retracted the paper, claiming it had been "deceived" by Wakefield’s "utterly false" assertion that autism was caused by the measles, mumps and rubella vaccine.

Two decades of research has failed to find any link. But this hasn’t stopped Wakefield’s claims from becoming lore for the conspiratorially minded.

YouTube is rife with antivaxxers spreading irrationality and fearmongering.

But vaccinating all children against these dread diseases protects all of society. It’s called herd immunity for a reason.

In recent years there have been numerous outbreaks of measles — including a "record-breaking" one in Europe, according to the World Health Organisation (WHO). This for a disease that was once all but eradicated. "Approximately 110,000 people died from measles in 2017 — mostly children under the age of five years — despite the availability of a safe and effective vaccine," the WHO states.

The disinformation plague has now extended to apps, which are a powerful form of communication.

A popular children’s developmental app, The Wonder Weeks, is based on a book of the same name. It claims a child’s development follows predictable "leaps" — usually preceded by a spell of grumpiness — and catalogues which developmental stage the child is going through.

But its founder, Professor Frans Plooij, was dismissed from the University of Groningen in the Netherlands after research by one of his own PhD students contradicted the book’s claims and he tried to quash publication of her findings. The app is still for sale and has been downloaded more than 500,000 times on Android alone.

Another recent case is the Femm app, which purports to be about female health and fertility, but is "funded and led by anti-abortion, anti-gay Catholic campaigners", The Guardian newspaper revealed last month. Downloaded more than 400,000 times, the "popular women’s health and fertility app sows doubt about birth control [and] features claims from medical advisers who are not licensed to practise in the US".

Other regular fodder includes conspiracy theories that the moon landing was faked, that US school shootings such as Sandy Hook (2012) and Parkland (2018) didn’t happen, and that the Earth is flat.

Right-wing hatemonger Alex Jones, whom Facebook finally banned last year, claimed on his InfoWars channel that the Sandy Hook shooting was a hoax concocted by the gun-control lobby. His supporters began harassing the parents of the murdered children — and the social networks did nothing.

It took six years of appeals by one of those parents, Leonard Pozner, before Twitter suspended the account of a regular InfoWars interview guest, Wolfgang Halbig, who insisted the massacre had never happened. Halbig badgered parents and demanded records relating to the tragedy. "Wolfgang does not wish to speak with you unless you exhume Noah’s body and prove to the world you lost your son," a fellow hoaxer e-mailed Pozner, according to The New York Times.

Then there’s the 104-minute YouTube documentary Conspiracy Theories with Shane Dawson, which trots out all the usual paranoid theories, including that the Earth is flat. It "kind of makes sense", Dawson has previously said of an idea science dispensed with more than 2,000 years ago. He has 20-million subscribers, and the video has racked up 30-million views.

Researchers who attended the annual Flat Earth conference over the past two years found attendees asking questions such as "Where is the curve?" and "Why is the horizon always at eye level?"

Lead researcher Asheley Landrum, from Texas Tech University, points to platforms such as YouTube: "Their algorithms make it easy to end up going down the rabbit hole, by presenting information to people who are going to be more susceptible to it."

YouTube’s business model depends on viewers spending as much time as possible watching videos, and therefore watching adverts. The bigger problem, as multiple studies have found, is that its recommendation algorithm steers viewers towards increasingly controversial content.

Guillaume Chaslot, a former YouTube software engineer, created the AlgoTransparency.org website to show how this algorithm works. He found that viral hoaxes and conspiracy theories are among the top recommended videos on about 1,000 popular YouTube channels.

In a move that is almost too little, too late, YouTube announced in January that it would alter its recommendation algorithm to curb the spread of "borderline content and content that could misinform users in harmful ways". This includes "videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat or making blatantly false claims about historic events like 9/11".

These conspiracy theories thrive in what researchers Danah Boyd and Michael Golebiewski call a "data void", a gap in credible content that can be "exploited by those with ideological, economic, or political agendas".

Unfortunately, national regulators have failed to pick up on these issues. But, then, they have failed to identify most of the problems associated with the consolidation of power by Big Tech and social media firms.

Hobby-sharing site Pinterest, of all platforms, has managed to solve its antivaxx problem. All it has done differently is ban antiscientific nonsense outright. Facebook, Twitter and YouTube, in contrast, "downrank" or "de-emphasise" such posts — but don’t remove them. Since February, searches for antivaxx content on Pinterest have returned no results at all.

YouTube, Facebook and Twitter could just as easily do the same, but they say they aren’t the arbiters of what’s true and what’s not.

Facebook CEO Mark Zuckerberg infamously said in July 2018 that, even though he finds Holocaust deniers "deeply offensive", he doesn’t think "they’re intentionally getting it wrong".

What he didn’t say is that Facebook gets enormous traffic out of such drivel. And the more eyeballs on the platform, the more advertising revenue it earns.

It’s about profit.

But, like other social networks, Facebook also fears a backlash from the alt-right and US conservatives who claim they are being censored. Besides, controversial content and the controversy it stirs up mean even more advertising opportunities.

As Apple CEO Tim Cook said earlier this month, the tech sector has a lot to answer for: "This industry is becoming better known for a less noble innovation: the belief that you can claim credit without accepting responsibility."

For social media users wary of being caught up in all this, the rule is simple: if it seems too good to be true, it likely isn’t true at all. The same goes for stories with headlines that include phrases such as "… left him speechless" and "you won’t believe what happened next". When in doubt, turn to the fact-checking website Snopes.