iPhone users have put up with a lot in recent months, but Apple’s new CSAM detection system has proved a lightning rod for controversy that stands out from all the rest. And if you were thinking of quitting your iPhone over it, a damning new report might just push you over the edge.

In a new op-ed published by The Washington Post, a pair of researchers who spent two years developing a CSAM (child sexual abuse material) detection system similar to the one Apple plans to install on users’ iPhones, iPads and Macs next month have delivered an unequivocal warning: it’s dangerous.

“We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous,” state Jonathan Mayer and Anunay Kulshrestha, the two Princeton academics behind the research. “Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”
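
The repurposing point is easy to see in code. Below is a minimal, hypothetical sketch, nothing like Apple’s actual NeuralHash pipeline and with all names and hash values invented for illustration, of the core of any client-side matcher: the matching logic is entirely agnostic about what the hash database actually contains.

```python
# Hypothetical sketch of a generic client-side matcher -- not Apple's code.
# The function name, database contents and hash values are all made up.

def scan_photo(photo_hash: int, blocked_hashes: set[int]) -> bool:
    """Flag the photo if its perceptual hash appears in the database."""
    return photo_hash in blocked_hashes

# The client only ever sees opaque hashes. Swapping a CSAM database for,
# say, hashes of banned political imagery requires no client change at all.
database = {0x1A2B3C4D, 0x5E6F7A8B}      # opaque perceptual hashes
print(scan_photo(0x1A2B3C4D, database))  # True -> photo gets reported
```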


This has been the predominant fear regarding Apple’s CSAM initiative. The technology’s goal of reducing child abuse is indisputably important, but the potential damage from hackers or governments manipulating a system designed to scan your iCloud photos and report abusive content is clear to all.

“China is Apple’s second-largest market, with probably hundreds of millions of devices. What stops the Chinese government from demanding Apple scan those devices for pro-democracy materials?” ask the researchers. 


And critics have plenty of ammunition here. Earlier this year, Apple was accused of capitulating to censorship and surveillance in China after agreeing to move the personal data of its Chinese customers to the servers of a state-owned Chinese firm. Apple also states that it provided customer data to the US government almost 4,000 times last year.

“We spotted other shortcomings,” Mayer and Kulshrestha explain. “The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.” 
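
The false-positive worry is also concrete. Perceptual hashes deliberately map similar-looking images to the same value, which means genuinely different images can collide. The toy below implements the classic “average hash” technique, far simpler than Apple’s NeuralHash but enough to show the failure mode: two different images produce identical hashes.

```python
# Toy "average hash" (the well-known aHash technique, not NeuralHash).
# It shows two visibly different images colliding on the same hash value.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash a grayscale image: bit i is 1 if pixel i is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

image_a = [[10, 200], [200, 10]]   # high-contrast checker pattern
image_b = [[50, 120], [130, 40]]   # a different image entirely
print(hex(average_hash(image_a)))  # 0x6
print(hex(average_hash(image_b)))  # 0x6 -- same hash, innocent collision
```

In practice attackers work in the other direction: given a hash in the database, they craft an innocuous-looking image that collides with it, subjecting whoever receives it to scrutiny.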


And recent history doesn’t bode well. Last month, revelations from the Pegasus Project exposed a global business that had been successfully hacking iPhones for years and selling its spyware to foreign governments for surveillance of anti-regime activists, journalists, and political leaders from rival nations. With access to Apple technology designed to scan and flag the iCloud photos of a billion iPhone owners, such attackers could go a lot further.


Prior to Mayer and Kulshrestha speaking out, more than 90 civil rights groups worldwide had already written a letter to Apple warning that its CSAM detection technology “will have laid the foundation for censorship, surveillance, and persecution on a global basis.”


Apple has subsequently defended its CSAM system, conceding that its announcement was poorly communicated and a “recipe for this kind of confusion”, but the company’s responses did little to impress Mayer and Kulshrestha.


“Apple’s motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours,” they said. “But we were baffled to see that Apple had few answers for the hard questions we’d surfaced.”


Now Apple finds itself in a mess of its own making. For years, the company has put considerable effort into marketing itself as the champion of user privacy, with its official privacy page declaring:


“Privacy is a fundamental human right. At Apple, it’s also one of our core values. Your devices are important to so many parts of your life. What you share from those experiences, and who you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information. It’s not always easy. But that’s the kind of innovation we believe in.”


Apple’s CSAM detection will launch with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey next month. I suspect that, for many Apple fans, it will mark the moment to walk away.
