The video game live-streaming site Twitch has struggled to keep underage kids from opening accounts. Picture: BLOOMBERG

Twitch, the video game live-streaming site popular with teens and children, is making changes on the platform to increase safety for its young users after criticism that it enables child predation.

Amazon.com-owned Twitch said on Tuesday it has introduced mandatory phone verification requirements and fortified technology used to catch and terminate accounts belonging to people under 13, among other measures. 

“Grooming is particularly insidious because it can be hidden in plain sight, and there are fewer established industry practices for detecting it,” Twitch said in a blog post. “These predators are not welcome and will not be tolerated on Twitch, and today we’re sharing an update regarding the continuous work we’re doing to combat them.” 

Bloomberg News published a report in September describing rampant child predation on Twitch and the platform’s insufficient moderation tools. Bloomberg analysed 1,976 Twitch accounts with follower lists made up of at least 70% children or young teens. More than 279,016 apparent children were targeted by predators, according to data collected by a researcher who studies live-streaming websites.

The researcher asked to remain anonymous due to concerns over potential career repercussions from being associated with such a disturbing topic.

In a subsequent analysis over the past month, Bloomberg has discovered new predatory accounts and more children being targeted.

After the report, UK internet regulator Ofcom contacted Twitch to discuss its poor protections for children on the platform. “We are actively reviewing whether Twitch’s measures are sufficiently robust to prevent the most harmful material being uploaded,” an Ofcom spokesperson said in an email.

Critics say the root of child predation on Twitch has been the ease with which children can lie about their age, sign up for an account and immediately live stream themselves to anonymous and unquantifiable audiences. YouTube and TikTok require users to possess a certain number of followers before live-streaming on mobile devices. TikTok, owned by ByteDance, recently announced plans to increase its age requirement for live-streaming to 18 from 16, effective November 23, and Alphabet’s YouTube doesn’t “list”, or make searchable, mobile live-streams from users under 17 by default. 

Early steps

Twitch still lags behind competitors on age verification and barriers to live-streaming for children. Unlike others, Twitch hasn’t required two-factor authentication for users signing up on mobile. With the latest updates, Twitch will now require at least one phone verification before live-streaming, which it says will help block children who had previously been suspended for streaming while underage from creating new accounts. According to a new analysis of the predatory accounts in the data set, those accounts are still finding and going after, on average, hundreds of apparent children a day.

“In the face of a tsunami of new legislation around the world, it is good to see some online services taking early steps towards adopting privacy-preserving age assurance methods, but none of the major global platforms most popular with children has yet adopted sufficiently comprehensive, audited age checks to keep children safe online,” said Iain Corby, executive director at the Age Verification Providers Association, a global trade body.

“Many still underestimate the risks their sites pose to young users, particularly through enabling contact with dangerous adults about which parents often have little or no awareness.”

It’s notoriously challenging to moderate live video in real time. Twitch broadcasts more than 2.5-million hours of live content in 35 languages daily. The site’s moderation relies heavily on user reports, which is part of why the May terrorist attack in Buffalo, New York, was live-streamed on Twitch for 24 minutes. The site stopped the shooter’s live stream within two minutes of the violence beginning, according to a 47-page report from New York State attorney-general Letitia James on the role Twitch and other online platforms played in the mass shooting.

Refining technology

Twitch now says it’s refining the technology human moderators use to review and take action on reports regarding children under 13. 

Twitch’s discoverability features, which have helped expand its ecosystem of creators, have also made it easy for predators to find children. Children who are streaming from their home bedrooms on their mobile phones can attract dozens or hundreds of viewers within minutes, including child predators who ask for live sexual acts through Twitch’s chat feature.

Two predators watching a child’s live stream in November said they discovered children through Twitch’s “recently started” feature, which reveals accounts with low follower numbers, often indicative of child accounts. The live stream attracted 165 viewers within 35 minutes, according to a Bloomberg analysis.

In 2020, Twitch removed its “recently started” feature, which can make underage accounts easier for predators to identify, from two content categories, but the activity persists in almost all others. On Tuesday, Twitch said it’s “expanding the signals we use to catch and terminate accounts belonging to users under 13”.

“There are a lot of people who are frankly very upset,” said Tom Verrilli, Twitch’s chief product officer, in an October interview at TwitchCon, the company’s annual convention. “Like on the rest of the internet, we understand that there are people who want to use Twitch for harm. We are always building against that.”

Bloomberg News. More stories like this are available on bloomberg.com
