CHRIS ROPER: Wikipedia’s moment of truth
As online encyclopaedia Wikipedia celebrates its 20th birthday, it’s worth reflecting on some of the challenges in the free-knowledge space
In one of my previous incarnations, I was responsible for the MWeb portal, then the biggest website in Africa. At the time, my job title — portal manager — was sneered at by those from traditional media as futuristic jargon.
Now, of course, with the demise of the big walled-garden portals that tried to own and constrain the internet, the job of portal manager is pretty much consigned to the dustbin of media history, like other quaint jobs such as typesetter or switchboard operator.
Among the many content sites we built was one called MWeb Religions, which featured sections for all the major religions in SA. Because we wanted to avoid bias, we made sure to include as many religions as possible, which meant, if memory serves, that we had the Bahá’í faith in there along with the more usual suspects.
The only problem was that, though the Christians were keen updaters of their section, the adherents of most other religions couldn’t really be bothered. Which meant the Christian editors used to help update some of the other sections so that they wouldn’t look like empty afterthoughts.
As you may imagine, that’s not an ideal way to avoid bias, unconscious or not. And it’s definitely not a way to achieve representation of voice.
The same thing, it seems, is happening with Wikipedia, that other great source of truth alongside the source documents of the various religions. And more pertinently for our purposes, Wikipedia SA.
Wikipedia has become a bit like the internet itself for people — something you take for granted and use often, but perhaps seldom take the time to think about.
As Wikipedia’s 20th anniversary is on January 15, it’s probably a good idea to refresh our understanding.
According to Wikipedia (yes, very meta, and we should pause to think about that too), "Wikipedia is a multilingual open-collaborative online encyclopaedia created and maintained by a community of volunteer editors using a wiki-based editing system.
"It is one of the 15 most popular websites, as ranked by Alexa, as of January 2021, and The Economist magazine placed it as the ‘13th-most-visited place on the web’.
"Overall, Wikipedia comprises more than 55-million articles, attracting 1.7-billion unique visitors per month."
For many, Wikipedia is the go-to starting point for online research, and is a vital part of the fight to combat misinformation.
Many have pointed out that the best way to push back against the grifters of fake news, propaganda and crazy QAnon balderdash is to pile on the truth. The more truth that’s accessible, and the more it speaks to different people, the more we’re inoculating against people buying into lies and disinformation.
Wikipedia is already being consciously used to fight misinformation. In 2018, "after a barrage of false news reports, both Facebook and YouTube announced they would rely on Wikipedia to help their users evaluate reports and reject false news", the platform notes.
YouTube CEO Susan Wojcicki said the video-sharing site would link to truthful articles on Wikipedia whenever it spotted conspiracy theories and other misinformation.
Washington Post writer Noam Cohen asked: "Is this the same Wikipedia once ridiculed as the encyclopaedia ‘anyone can edit’ and later feared as a prime example of how the internet was empowering a tyranny of the masses to overwhelm genuine expertise?"
Wikipedia today, he continued, "is much more likely to be venerated as evidence of ‘the wisdom of the crowds’, a less-than-shocking concept now that we have become accustomed to seeking out and relying on the contributions of relative strangers".
If we needed an example of how Wikipedia tries to achieve truth, there’s its own description of its failings: "Wikipedia has been criticised for its uneven accuracy and for exhibiting systemic bias, including gender bias, with the majority of editors being male."
These problems of representation are, according to Wikipedia SA’s Douglas Scott, worse in SA.
Scott says there are three main problems. First, there aren’t enough "Wikipedians" (the organisation’s term for volunteer editors) to add SA-related content. This is a general problem, Scott says, but it’s exacerbated when it comes to finding "black and female editors to help edit the project".
Second, there’s not enough diversity of SA knowledge. His example strikes a chord, given that my organisation, which occasionally collaborates with Wikipedia, uses the collective noun "Wikibeards" to refer to the local editors when they visit our offices in Cape Town.
"There is a good number of articles on SA train types, but not enough articles on anti-apartheid activists or Xhosa clans, for example," Scott says.
There is "a particular need for editors to help grow African language Wikipedias in SA and Africa generally. Afrikaans Wikipedia is growing well, and [is] expected to reach 100,000 articles later this year. However, other African-language Wikipedias are in great need of people to help grow them. Xhosa Wikipedia, for example, only has 1,200 articles written by a small pioneering community of only 18 active editors catering to a community of 8.2-million speakers."
The solution to these two problems — more, and more diverse, volunteer editors — is also, perhaps, the solution to the third problem, the problem that interests me most: making sure Wikipedia doesn’t become a space for the disinformation it helps to mitigate.
The free-knowledge movement that Wikipedia is a part of has the laudable aim of providing the ever-shifting sum of human knowledge to everyone. And, unlike Google, it would do so ad-free.
The volunteer, peer-review, collaborative, large-scale nature of Wikipedia means that the more people there are who contribute to editing and assimilating that knowledge, the better the quality and reliability of that knowledge and, importantly, the more diversely representative it will be.
Currently, and in an analogous way to the MWeb Religions portal, we’re outsourcing our truths to the dominant demographic. And without imputing deliberate bias to those demographics (white, male, American and so on), "a diversity of views helps reduce the risk that people will miss information that one group views as outrageous, prejudiced, misleading, or just wrong", says Scott.
"An editor in the US, for example, is unlikely to know about the nuances of xenophobia in SA and so it is more difficult for them to spot bad information about that subject."
After the year we’ve just had — after the past few days we’ve just had! — I doubt I need to spend more time convincing anyone about the dangers of misinformation, whether weaponised, inadvertent or mainstream.
While it’s fun to write columns that are basically the equivalent of stumbling around the digital streets with a sign around your neck proclaiming "The end is nigh", we do need solutions. And they have to be solutions we can all contribute to, because truth is no longer the word of gods, lawmakers and cultures, but crowdsourced and fungible.
One of those solutions is doing your bit to put the truth out there, whether it’s as an editor on Wikipedia, or the voice of fact-based reason on other platforms.
Oh, and happy 20th birthday, Wikipedia.