A Facebook logo is displayed on a smartphone. Picture: REUTERS/DADO RUVIC

A product manager who’d worked at Facebook turned over reams of internal documents to government officials and the Wall Street Journal earlier in 2021 because, she said, she’d concluded that the company was unable to change on its own.

When there were “conflicts of interest between profits and the common good and public safety, Facebook consistently chose to prioritise its profits,” Frances Haugen told Congress at a public hearing on October 5. “I realised we needed to get help from the outside, that the only way these problems would be solved is by solving them together, not solving them alone.”

The hearing came after weeks of damaging revelations based on the documents Haugen shared with the Journal and the day after a technical problem took down Facebook’s core products for hours, contributing to a sense of crisis at the company. Haugen told Congress what she thought Facebook had to do to make its main social network and its Instagram photo-sharing platform safer, healthier, and less polarising. There are lots of ideas on this subject — Congress and federal enforcement agencies have spent the last several years formulating their own plans to hold Facebook to account — and Haugen’s suggestions didn’t always align with those of the company’s harshest critics.

Haugen’s overarching point: Facebook needs to make big product changes that, she said, would probably lead to less engagement, a decline in advertising dollars, and ultimately less profit. She called out one of the core aspects of the social network’s experience, its “engagement-based ranking” system that determines what people see in their news feeds. In 2018, Facebook prioritised engagement as part of a shift towards “meaningful social interactions,” which the company said would lead to more visibility for posts from friends and family and less for brands and publishers. The effect, according to Haugen, was to emphasise content that elicited strong emotional reactions, even if it was hateful or polarising.

Instead of engagement, Haugen would prefer that Facebook show content mostly in chronological order. It already offers this option but doesn’t allow users to make it their default setting. Company executives have said a chronological feed would be less relevant to many users than one in which its algorithms choose personalised content.

Another idea that Facebook has partially implemented is the removal of “like counts” showing how many people reacted to posts on Instagram, a feature that’s been linked to the social anxiety that the product causes, particularly in teens. After toying with the idea for years, the company said recently it would allow people the option to hide the like counts. Haugen called for more sweeping action, saying minimum ages for social networks should be raised to as high as 18 years, because young people don’t have the self-regulatory skills to adjust their behaviour. The platform’s algorithm has “feedback cycles where children are using Instagram to self-soothe but then are exposed to more and more content that makes them hate themselves,” she said.

Facebook’s own research about the harms of youthful Instagram use was one of the most explosive aspects of the Journal series based on Haugen’s revelations. In a statement, Nick Clegg, Facebook’s vice-president for global affairs, said the reports contained “deliberate mischaracterisations,” without offering specific details. The company said in another statement that Haugen was too junior to understand its approach to the issues she testified about. It repeated its call to reform an important 1996 law governing social media sites.

Legislators have introduced several bills targeting large technology companies, some of which deal with aspects of product design. While members of both parties agree on the need for stronger privacy protections, they differ on the more politically charged issue of content moderation, with Democrats focused on the spread of misinformation and Republicans accusing technology companies of bias against conservative views.

There is already bipartisan support for bills confronting the market power of Facebook and a handful of other huge tech companies, and the Federal Trade Commission has challenged the company’s acquisition of Instagram and instant-messaging platform WhatsApp. Haugen said breaking up Facebook wouldn’t help. Doing so, she argued, would lead only to smaller companies with the same privacy and transparency problems. And without Instagram to drive its growth, Facebook would stagnate and “continue to be this Frankenstein that is endangering lives around the world.”

Bloomberg BusinessWeek. More stories like this are available on bloomberg.com

