Picture: 123RF/JOSE JONATHAN HERES

New York — TikTok is known primarily as a launch pad for funny memes, dance routines and lip-syncing videos. The company embraces that reputation with a tagline, “the last sunny corner on the internet”. But there’s a dark side to TikTok that engulfs some of the app’s youngest users.

Beneath the surface, TikTok also hosts videos promoting anorexia, bullying, suicide and sexual exploitation of minors. Personalised recommendations, driven by algorithms owned by parent company ByteDance, make it harder for parents and regulators alike to monitor what children are being exposed to on the app.

“Parents think that TikTok has some redeeming values,” David Gomez, a school resource officer in Idaho, said on the fourth episode of Foundering: The TikTok Story. “Videos, lip syncing, singing, dancing around. OK. I see that stuff. But parents are just not understanding how many predators are on TikTok.”


A spokesperson for TikTok said the company is “deeply committed” to the safety of minors and that it continues to strengthen safeguards. In January, TikTok stopped allowing strangers to comment on videos posted by users under 16 years old. It also restricted the ability to download those users’ videos and changed the default setting on children’s accounts from public to private.

But the problems began years before TikTok even existed. Children flocked to Musical.ly, the precursor to TikTok. Back then, one advertising executive called it “the world’s youngest social network” because its audience included primary schoolchildren.

Safety advocates said that for years TikTok prized expansion over the protection of minors. “Their company exploded in growth across the world, and they just didn’t prioritise child safety as they were growing,” said Dawn Hawkins, who runs a US advocacy group called the National Centre on Sexual Exploitation.

Hawkins said she spent months helping an eight-year-old relative get inappropriate videos of him in his underwear removed from TikTok. She acknowledged that TikTok has made a number of improvements, but said it is still not a safe place for young children to roam unmonitored.

Bloomberg News. For more articles like this, please visit us at bloomberg.com.
