Picture: 123RF/BIGTUNA ONLINE

As Facebook weathers yet another scandal, this time fuelled by its internal research on the effects of Instagram, I’d like to focus on something slightly different that should be a scandal, too: the quality of that internal research.

Facebook has been pushing back against a story in The Wall Street Journal, which cited a leaked internal report suggesting that Instagram harms teenagers by fostering insecurities around “social comparison” and sometimes even suicidal thoughts. In its public-facing blog, the company featured an article entitled “What Our Research Really Says About Teen Well-Being and Instagram.” It pointed out ways in which the app was found to be benign or mildly positive, and also sought to downplay the significance of the research, noting that it “did not measure causal relationships between Instagram and real-world issues,” and sometimes “relied on input from only 40 teens.”

Facebook is right on one point: Its internal research doesn’t demonstrate much of anything. This just isn’t how science is done. One never relies on a single study to determine a relationship, in part because any single experiment entails too many choices that limit its applicability. Do you study teenagers or young adults? Men or women? How do you reach them? What questions do you ask? Do you follow up after six months or 12? And that’s just for design, let alone analysis. Only when multiple studies with different approaches get the same answer can one start to draw strong conclusions.

That said, one can reach a pretty strong conclusion by observing the way Facebook has done research over the years: It’s afraid to know the truth. After all, why not do more studies? If it’s possible that your product is leading young women to kill themselves, wouldn’t you want to explore further, at least to clear your name? Why not let outside researchers use your data, to get a better answer faster? Instead, Facebook allows only tiny internal studies and tries to keep them under lock and key. Even if they leak, the company maintains deniability: The results are far from conclusive.

Facebook is not alone in its aversion to self-knowledge. Something similar happened at Google not so long ago, when internal researchers had the audacity to think they were able to do critical work, writing a paper on how large language models like those used at the company can be environmentally damaging and even racist. In that case, Google fired two of the founders of its Ethical Artificial Intelligence team, Timnit Gebru and Meg Mitchell.

I’m not so naive as to think that public embarrassment will compel the big tech companies to allow access for real science on the impact of their products. Making that happen is a job for Congress or the Federal Trade Commission. In the meantime, as the likes of Facebook and Google sweep through academic departments with job offers for newly minted PhDs and even seasoned professors, those who choose to take the money should recognise that what they’re getting into isn’t scientific inquiry — and still leak whatever they can.

O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”

Bloomberg. More stories like this are available on bloomberg.com/opinion.

