World wide web inventor Sir Tim Berners-Lee speaks during the inauguration of Web Summit in Lisbon, Portugal, on November 5 2018. Picture: REUTERS/PEDRO NUNES

Tim Berners-Lee — considered the father of the world wide web — recently published a “contract for the web”, prompted in large part by the despair he and like-minded individuals feel about the state of the internet.

Yes, they want to get more people online, and yes, the web is a great source of knowledge, collaboration, creativity and innovation. But too often it is also a cesspool of dodgy data ethics and trolling, and a digital water cooler where extremists of all kinds lurk and find commonality.

So how would you characterise the state of the web? Sometimes it’s useful to trudge through some history to see the current form for what it is, and as luck would have it, this week I chanced upon some of my old blogs to remind me of days of yore. To my shame, I was that person practising my reporting and writing skills, but also — if I’m honest — navel-gazing in public. I started blogging at about 20 years old, and I managed and contributed to many (personally and professionally) for the next 10 or so years.

Stumbling across these sites served as a reminder of a time when it was rare — or perhaps, rarer — to be an online content contributor. This was, roughly, the very end stages of “Web 1.0”. In that iteration, most people were passive consumers of content online.

After about 2000 this started to rapidly shift, to the point where nearly every man and his dog had a blog (usually about his dog). Then 2003 gave us MySpace, and 2004 Facebook. Soon our online endeavours became what might (kindly) be called “microblogging” through posts, statuses and tweets. I often joke that my main hobby these days is deleting the things I put on Facebook in its early days. “Kate is looking forward to this weekend”. Great. Thanks, Past Kate, for that exciting contribution.

Web 2.0 included shifting software applications off our local machines and onto the web. Moreover, it featured the ability to contribute to and collaborate on websites, such as social networking platforms.

But not everyone was sharing inane observations and family photos. This era also saw the rise of sites such as 4chan, where anonymity isn’t so much an option as it is the raison d’être. This dank corner of the web — and a few others like it — positions anonymity as sacrosanct, like a return to the “old ways” of chat groups. 4chan quickly devolved from anime discussions to ... well ... a urine-warm paddling pool of perversion, patriarchy and white supremacy chatter.

In 2013, software developer Fredrick Brennan launched 8chan, an alternative to what he called the now-too-restrictive (the mind boggles) 4chan. Within five years this digital sewer got so rank that even Brennan exited, and last year he publicly condemned it and called for its closure. This happened but, predictably, it re-emerged as 8kun in November 2019. This is the Parktown prawn of imageboard sites. It just won’t die.

The point here is not just to shake my fist in anger, but to acknowledge how these types of sites directly contributed to the growth of alt-right communities and, ugh, incels. The link has been well documented and researched, but here are a few illustrative examples:

  • Alek Minassian (who drove a van into a crowd in Toronto in 2018, killing 10) made reference to 4chan in a post on the day of the attack.
  • Californian spree killer Elliot Rodger was shown to be a frequent 4chan user, and used YouTube to publish his video “manifesto” explaining his “justification” for what he would go on to do. He is — and I genuinely struggle to even type this — considered a hero by many in incel communities.
  • In March 2019, a white supremacist in New Zealand live-streamed his attack on two Christchurch mosques, in which 51 people were killed and many more injured. The video was then shared across other platforms, outpacing efforts to shut it down.

This was, presumably, not the way web pioneers hoped things would go. When analysts and industry experts started to imagine what Web 3.0 would be, they talked of decentralisation and free expression, and the Semantic Web, which was going to see machines processing and understanding data in the relational, contextual way people do.

This dream hasn’t been delivered. Yes, we have smarter tech, things like Siri and other digital personal assistants that can execute an action or process data, but a machine that makes meaning, not so much. A pessimist might say 3.0 was instead all about trolling and hate speech, exclusionary and discriminatory ideas shared by people emboldened by echo-chamber communities.

Unfortunately, there is no technology or platform that has a solid handle on managing the mayhem. Which is why Berners-Lee and co are asking citizens, companies and governments to commit to their proposed contract — a set of nine principles focused on safeguarding privacy and building tech and communities that enable human dignity and civil discourse. Can the contract save us from ourselves? Time will tell.

• Thompson Ferreira is a freelance journalist, impactAFRICA fellow and WanaData member.
