Is this Facebook’s ‘Big Tobacco’ moment?
Despite calls for greater transparency about the effects of social media on users, Facebook executives defend the company’s actions
Facebook executives have long boasted that the company’s platforms are safe, even as they invested in ways to keep teenagers hooked and hid what they knew about the side effects.
Sound familiar? Critics say Big Tobacco once used the same playbook, and it is fuelling a whole new level of outrage against the social media giant.
Facebook consistently played down its own research that showed how photo-sharing app Instagram can harm the mental wellbeing of its youngest users, according to a report in the Wall Street Journal. Almost a third of young teen girls told Facebook they feel worse about their bodies after scrolling through the site, documents reviewed by the newspaper showed.
Despite that knowledge, Facebook is dedicating more resources to reaching even younger consumers, including developing a children’s version of Instagram.
The revelations are prompting legislators to compare Facebook’s actions to a decades-long campaign by the country’s biggest tobacco companies to mislead the public about the cancerous and habit-forming effects of cigarettes.
“Its executives knew about the addictive chemicals in tobacco and yet they did nothing to try to keep the product out of the hands of children,” says representative Bill Johnson, an Ohio Republican.
“They knew that if they could get children addicted early, they’d have a customer for life. It’s very much the same way — children, young people, are addicted to these platforms, and you can see report after report on the damage that’s being done.”
The long-term effects of social media are exactly what is driving concerns about Facebook’s plan to build an Instagram for children. The service, sometimes called Instagram Youth internally, is intended to give preteens an entrance ramp onto social media until they turn 13 and are allowed to join the main site.
Facebook argues that children are lying about their age to get on Instagram anyway, so a youth-orientated product — with parental controls — would be a safer alternative.
More than three dozen state attorneys-general have already urged Facebook CEO Mark Zuckerberg to drop the project, arguing that Instagram Youth could contribute to conditions such as depression, loneliness and anxiety. So have US legislators and a coalition of privacy and child welfare advocates.
“If Facebook goes ahead with Instagram Youth, then really what we’re saying is they’re accountable to no-one,” says Josh Golin, executive director of Fairplay, a nonprofit dedicated to ending marketing aimed at children.
To understand how children’s mental wellbeing is affected by Instagram, Facebook surveyed tens of thousands of users and mined its own data over the past three years, according to the Journal, which based its reporting on internal Facebook research that the publication obtained.
The review found that users felt under pressure to present an idealised version of themselves on Instagram, and that it often led them to make negative comparisons of themselves with others. Internal researchers warned that Instagram’s design led young people towards potentially harmful content on the platform.
During a March 2021 congressional hearing, Zuckerberg was far less forthcoming about the evidence on social media’s effects on mental health, boasting instead that online connections can help people feel less lonely.
When asked whether Facebook had internal research on the impact of its platforms on children, Zuckerberg said it was something they “try to study” before adding: “I believe the answer is yes.”
The research on social media and mental health can be ambiguous. Some studies show a link between heavy use and childhood depression, lower self-esteem and suicidal tendencies. Other academics argue the correlation between social media use and poor mental health outcomes is weak and that other factors could be at work.
Experts in both camps agree that Facebook is best positioned to conduct the highest-quality research, because it knows exactly what its users are doing on Instagram and for how long.
On September 14, Karina Newton, Instagram’s head of public policy, wrote a blog post highlighting the similarly inconclusive nature of the company’s research.
“Social media isn’t inherently good or bad for people. Many find it helpful one day, and problematic the next,” she wrote. Instagram is looking into ways to steer vulnerable users away from certain types of posts and “towards content that inspires and uplifts them”, she added. A Facebook representative declined to comment beyond the contents of the blog.
Big Tobacco’s strategy was, for decades, to cast doubt on public-health research. A full-page advertisement published nationwide in newspapers in January 1954 established the industry’s public messaging for the next 50 years: Smoking was not a proven cause of lung cancer, and more research on cigarettes and health was needed.
While the tobacco companies questioned and distorted scientific data, their own research recognised the risks. They also understood that nicotine was addictive, even as they publicly denied its effects to avoid regulation and thwart legal liability from smokers. It was a whistle-blower — an executive from the Brown & Williamson Tobacco company — who helped expose the industry’s secrets.
Facebook’s strategy is to make its platforms more addictive, just as cigarette companies did with additives, the company’s former director of monetisation, Tim Kendall, told Congress last year.
Facebook relies not just on likes and updates to keep users hooked, but also on misinformation and conspiracy theories that provoke a strong reaction, he said. “These services are making us sick.”
It took decades for the government to hold Big Tobacco to account. In 1998 a coalition of states reached a $246bn settlement with the industry that required companies to make annual payments to the states and limited the visibility of cigarette advertising. A year later the US Department of Justice sued the tobacco companies, accusing them of a racketeering conspiracy to defraud the public.
In 2006, after a nine-month trial, a federal judge in Washington agreed, saying that the companies “marketed and sold their lethal product with zeal, with deception, with a single-minded focus on their financial success, and without regard for the human tragedy or social costs that success exacted”. Dozens of states passed laws banning smoking at restaurants, bars and workplaces.
The percentage of high school students who smoked frequently plummeted to 2.6% in 2017 from 16.7% in 1997, a report from the Centers for Disease Control and Prevention (CDC) showed. Still, about 34-million adults in the US remain smokers, according to the CDC.
Fixing social media’s ills will require a similar shift in public awareness, says Asha Rangappa, who teaches a course on social media and information warfare at Yale University’s Jackson Institute for Public Affairs. “The idea that information can actually cause harm is not something that we as Americans can get our head around,” she says.
On Capitol Hill, representative Johnson is working on a bill instructing the National Institutes of Health to study the mental health risks of social media and whether to apply a warning label to the tech platforms. Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) said they were in touch with a Facebook whistle-blower and “will use every resource at our disposal to investigate what Facebook knew and when they knew it”.
In recent days, Facebook executives have defended the company’s actions but have yet to publicly release more internal studies. Nick Clegg, Facebook’s vice-president of global affairs and communications, pledged that the company would continue to invest in research on complex issues and “improve our products and services as a result”.
Blumenthal, chair of the Senate subcommittee on consumer protection, product safety and data security, plans to hold hearings on Facebook’s knowledge of its harmful effects.
“We’re at a turning point, because the analogy to Big Tobacco is very apt,” says Blumenthal. “It’s not just that they were doing harm, but they knew it and they concealed it, which is what makes it all the more hideous because people became addicted and the harm was compounded.”
Bloomberg News. More stories like this are available on bloomberg.com