On Thursday, we noted a claim that Apple Inc. (NASDAQ: AAPL) was preparing to introduce tools that can be used to identify child abuse images on iPhones and other products. Johns Hopkins University professor and computer security expert Matthew Green was not entirely sanguine about the project. The New York Times's DealBook quoted Green:
I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends.
Green got more company as the day wore on. The Electronic Frontier Foundation (EFF) warned that Apple is “planning to build a backdoor into its data storage system and its messaging system.” That is not something privacy advocates like the EFF support.
Apple posted a web page, “Expanded Protections for Children,” introducing three new child-safety features:
First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
[Second], iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM [Child Sexual Abuse Material] online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
[Third], updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.
Apple noted that these features will be available later this year in iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
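Apple's published technical summary describes the CSAM detection step as on-device matching of image hashes against a database of hashes of known, previously identified CSAM, using a perceptual hash function Apple calls NeuralHash together with a private set intersection protocol, so matches are revealed to Apple only past a threshold. As a rough illustration of the general hash-matching idea only, not Apple's actual algorithm, here is a minimal Python sketch using a simple average hash; the hash values and the distance threshold are invented for the example:

```python
# Minimal sketch of client-side perceptual-hash matching.
# Illustrates the general technique only; Apple's actual system uses
# its own NeuralHash function plus cryptographic private set
# intersection, neither of which is reproduced here.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then emit one bit per
    pixel: 1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known, previously identified images.
KNOWN_HASHES = {0x8F3C_21D4_7A65_0E9B}

def matches_known_image(path: str, max_distance: int = 5) -> bool:
    """Flag the image if its hash lands within max_distance bits of any
    known hash, so near-duplicates (resized, recompressed) still match."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in KNOWN_HASHES)
```

Unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized or recompressed, which is why matching tolerates a small Hamming distance rather than requiring exact equality.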
EFF acknowledges that child exploitation is a “serious problem,” but Apple’s decision to “bend” its robust privacy protection measures to combat the problem is a “choice [that] will come at a high price for overall user privacy.” The client-side scanning program that Apple proposes to run on iPhones, iPads, Apple Watches and Macs cannot be restricted to scanning only for CSAM images.
[I]t’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.
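EFF's “configuration flags” point is architectural: the matching machinery is indifferent to what it matches, so widening its scope is a data and configuration change rather than a code change. A deliberately oversimplified and entirely hypothetical sketch (exact-match sets in place of perceptual hashes, invented values throughout) makes the point:

```python
# Hypothetical illustration of EFF's argument: the scanner's scope is
# set entirely by its inputs, not by anything intrinsic to the code.
# None of this is Apple code; all names and values are invented.

def scan_account(photo_hashes: set[int], target_hashes: set[int]) -> set[int]:
    """Return the photo hashes that appear in the target database.
    The function cannot tell what kind of content the database describes."""
    return photo_hashes & target_hashes

# As announced: a database of known CSAM hashes, applied to certain accounts.
csam_db = {0x1A2B, 0x3C4D}
flagged = scan_account({0x3C4D, 0x9999}, csam_db)   # -> {0x3C4D}

# EFF's warning: the identical call with a swapped database and a wider
# account filter scans anyone's photos for any disfavored content class.
other_content_db = {0x7777}  # e.g. politically disfavored images
flagged = scan_account({0x7777, 0x9999}, other_content_db)  # -> {0x7777}
```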
We can’t say we weren’t warned. In an interview with the New York Times last May, Harvard professor emerita Shoshana Zuboff cited a data scientist who told her, “Look, the underlying norm of all software and apps designed now is data collection.” Zuboff noted that every app is “designed to engage in surveillance,” and had this to say about Apple:
Apple still makes the majority of its revenues through its sales of iPhones and other devices. Nevertheless, an increasing portion of its revenue comes from services, and a big chunk of services is selling apps. So even if it’s not a surveillance capitalist, it is a powerful enabler. A powerful accessory to this crime of surveillance capitalism.
And, of course, there are other ways in which Apple and Mr. Cook really violate the principles that he so eloquently states. Apple in China is obviously a huge example of that. Apple’s relationship with Google. So Apple is deeply compromised.
Zuboff published The Age of Surveillance Capitalism in 2019, a book the Financial Times named the book of the year. Her first book, In the Age of the Smart Machine, was published in 1988 and remains in print.