The Apple logo at the entrance to the Apple store on 5th Avenue in Manhattan, New York, US. Picture: REUTERS/MIKE SEGAR

Apple is delaying a system that would have scanned customers’ photos for signs of child sex abuse after fierce criticism from privacy advocates, who feared it could set the stage for other forms of tracking.

The company had announced the feature in early August, along with other tools meant to protect children and root out illicit pornography, and quickly faced concerns that it would create a backdoor through the company’s highly prized privacy measures. Apple scrambled to contain the controversy in the following weeks, saying it would tap an independent auditor to oversee the system, but the outcry persisted.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said in a statement on Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The backlash has added to growing scrutiny of Apple in recent months. Earlier this week, the company agreed to change its App Store policies to address criticism that it’s anticompetitive. And employees have become increasingly vocal about problems within the company, including what they say is a lack of pay equity. The US National Labor Relations Board is currently looking into two complaints from workers that originated with concerns about workplace safety and a lack of pay transparency.

Apple had planned a trio of new tools designed to help fight child sex abuse material, or CSAM. They included using the Siri digital assistant for reporting child abuse and accessing resources related to CSAM, as well as a feature in Messages that would scan devices operated by children for incoming or outgoing explicit images. 

The third feature was the most controversial: one that would analyse a user’s library in iCloud Photos for explicit images of children. If a customer was found to have such pictures in their library, Apple would be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.

Privacy advocates such as the Electronic Frontier Foundation warned that the technology could be used to track things other than child pornography, opening the door to “broader abuses.” They weren’t assuaged by Apple’s plan to bring in the auditor and fine-tune the system, saying the approach itself inevitably undermines the encryption that protects users’ privacy.

In its attempts to defend the new CSAM feature, Apple coached staff on how to field questions about it. It also said the system would flag an account only when it held about 30 or more potentially illicit pictures.

Apple is far from alone in taking such steps. Facebook has long had algorithms to detect such images uploaded to its social networks, and Google’s YouTube analyses videos on its service for explicit or abusive content involving children. Adobe has similar protections for its online services.

Apple’s CSAM feature would work by assigning a so-called hash key to each of the user’s images and comparing the keys with ones assigned to images within a database of explicit material. Some users have been concerned that they may be implicated for simply storing images of, say, their baby in a bathtub. But a parent’s personal images of their children are unlikely to be in a database of known child pornography, which Apple would cross-reference as part of its system.
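In rough pseudocode, the matching logic described above looks something like the sketch below. This is a simplified illustration only: the hash function, database and helper names here are stand-ins, and Apple’s actual system uses a perceptual hash (NeuralHash) that matches visually similar images, along with cryptographic threshold techniques, rather than the plain lookups shown.

import hashlib

# Stand-in for a perceptual hash. Apple's real system uses NeuralHash,
# which matches visually similar images, not just byte-identical files.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known CSAM supplied by child-safety organisations (illustrative; empty here).
KNOWN_CSAM_HASHES: set[str] = set()

# Apple said an account would be flagged only at roughly 30 or more matches.
MATCH_THRESHOLD = 30

def count_matches(user_images: list[bytes]) -> int:
    """Count how many of a user's images match the known-hash database."""
    return sum(image_hash(img) in KNOWN_CSAM_HASHES for img in user_images)

def should_flag_for_review(user_images: list[bytes]) -> bool:
    """Refer an account for human review only once matches reach the threshold."""
    return count_matches(user_images) >= MATCH_THRESHOLD

Because matching runs against a curated database of known material, an ordinary family photo would produce no match: it is the comparison with previously identified images, not the content of the photo itself, that triggers a flag.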

Apple also tried to tamp down concerns about governments spying on users or tracking photos that aren’t child pornography. It said its database would be made up of images sourced from multiple child-safety organisations, not just the National Center for Missing & Exploited Children, as was initially announced. The company also plans to draw on data from groups in regions under different governments, and said the independent auditor will verify the contents of its database.

The system also affects only photos that customers upload to their iCloud accounts. Apple has said it would refuse any requests from governments to use its technology as a means to spy on customers.

The feature had been slated to go into effect before the end of the year, potentially overshadowing a flurry of Apple product announcements that are expected in the coming weeks. The company is rolling out updated iPhones, iPads, AirPods and Macs, as well as a new larger Apple Watch, people familiar with the matter have said. 

Bloomberg. More stories like this are available on bloomberg.com
