The real reason everyone hates Apple’s child porn idea (it has nothing to do with privacy)
It’s about something irrationally human. And Apple needs to back down.
Apple announced last week that as part of a broader initiative to fight child sexual abuse, the company plans to automatically scan all user photos stored on iCloud and compare them to a known repository of what we’re now calling “child sexual abuse material,” or CSAM (pronounce it: SEE-sam). When several matching pictures are detected, they’re sent to Apple moderators, who actually view the pictures and are directed to report offending users to the National Center for Missing and Exploited Children.
Here’s where it gets weird. Apple’s system won’t scan all images stored on the phone, only pictures uploaded to iCloud. But it won’t scan those photos in iCloud, either; the scanning happens locally on the phone. To make that possible, Apple plans to download the entire database of child porn image-identification data to every Apple device. That database is unreadable by humans.
Specifically, your iOS device will apply a digital “fingerprint” called a NeuralHash to each photo that is uploaded to iCloud. This “fingerprint” is compared with a non-visual database of child abuse images. Each photo match triggers a kind of “red flag” called a “safety voucher.” If several of your photos trigger safety vouchers, Apple will decrypt the flagged photos and look at them, then report you if they’re deemed to show child sexual abuse.
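To make that flow concrete, here’s a minimal sketch of the match-and-threshold idea. It’s purely illustrative: the type names, hash values and threshold are assumptions for the example, not Apple’s actual NeuralHash format, database or APIs.

```swift
import Foundation

// Illustrative stand-in for a NeuralHash value (not Apple's real format).
typealias PhotoHash = String

// Hypothetical on-device copy of the non-visual CSAM hash database.
let knownCSAMHashes: Set<PhotoHash> = ["hashA", "hashB", "hashC"]

// Only photos queued for iCloud upload get fingerprinted and checked.
let photosQueuedForUpload: [PhotoHash] = ["hashX", "hashB", "hashY"]

// Each match produces a "safety voucher"; human review kicks in only after
// "several" vouchers accumulate. The real threshold isn't public; 5 is made up.
let reviewThreshold = 5

let safetyVouchers = photosQueuedForUpload.filter { knownCSAMHashes.contains($0) }

if safetyVouchers.count >= reviewThreshold {
    print("Threshold reached: flagged photos are decrypted for human review.")
} else {
    print("\(safetyVouchers.count) voucher(s): below threshold, nothing is decrypted or reported.")
}
```

The point of the threshold step is that a single chance match never reaches a human reviewer; only an accumulation of matches does.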
The practice of scanning for CSAM is common — Facebook, Twitter and Reddit all do it for pictures uploaded to their services.
The only difference is that Apple plans to do it on your phone, rather than in the cloud. And that’s what’s driving the controversy.
This makes no sense
Fighting child sexual abuse is a righteous cause that everyone favors.
Most users are ignorant of, apathetic about, in favor of or accepting of running CSAM scans in the cloud. It’s a de facto acceptable practice.
Apple’s approach is weird and irrational. If Apple is running CSAM scans on devices, why not run them on all the photos stored on the device, instead of only those photos that are also uploaded to iCloud servers?
Committed pedophiles will just turn off iCloud backup. Everyone else gets scanned.
If it makes no sense, then why is Apple doing it?
Apple touts privacy as a central differentiator with other Silicon Valley companies and other smartphone makers. Tech companies like Facebook, Amazon and Google routinely hoover up gigabytes of data on every user in order to personalize, advertise to and generally monetize them; Apple positions itself as the company that doesn’t.
Apple believes it has found a kind of cheat code: Process personal data on devices, rather than transmitting personal information to a central processing center and processing it there. As one example, Apple runs face recognition AI to know who is in your photos — just like Facebook and Google do — but uniquely Apple runs the AI on your phone, rather than in the cloud.
In keeping with Apple’s policy to do advanced user data processing on phones, Apple wants to do the same thing for CSAM detection.
If everyone does it in the cloud, why do people object to it on the phone?
Tech pundits, bloggers, podcasters and other influencers say they hate Apple’s plan because it violates privacy.
Another point of objection is the slippery slope argument. If they can scan for CSAM now, what’s next? Looking for copyright violations?
And yet another is that it makes users vulnerable to being targeted by planted photos.
Apple’s plan to run CSAM scans on devices is a bad idea, but not for the reasons everyone says.
None of these objections are valid. First, it doesn’t violate privacy. No human looks at photos unless there’s overwhelming evidence that several photos on a device are known CSAM images.
Second, there’s no slippery slope here. Scanning photos for CSAM is commonplace, and doing it on device doesn’t represent some bold new direction.
And, third, planted photos are already a risk under cloud scanning: it’s actually easier to plant images in someone’s cloud account, or on a phone set to auto-upload them to the cloud where they’ll be scanned, than it is to sneak them onto the phone itself.
The reason people believe these are their objections is that, well, we’re irrational primates who get bothered by things and then look around for reasons why we’re bothered.
If you really think about it, it’s pretty clear that the response is an emotional, visceral one. Here’s why I believe people hate this idea:
People don’t want child porn on their devices, even if it’s unreadable
The idea that a database of information that describes, identifies or specifies images of children being sexually abused is to be carried around in our pockets is revolting. This database is readable only by machines. But the normal and natural human reaction to acts like child sexual abuse, cannibalism or any number of other horrors is the desire for distance. We want first of all to get away from it. We don’t want to carry around anything even remotely related to it in our pockets and purses.
Users think they own the software
Further driving home the sense of invasion is the intuitive belief by people that they own the phone and everything on it.
In fact, legally (and “ownership” is a legal concept) you own your phone’s hardware, but not the software. Apple isn’t cagey or evasive on this point. They’re very explicit about it. Their software license, which you agreed to when you set up your iPhone, says:
[Apple software] is “licensed, not sold, to you by Apple Inc. for use only under the terms of this License… Apple and its licensors retain ownership of the Apple Software itself”…
As such, to Apple, there is no real distinction between iCloud and the inside of your phone’s operating system: Apple owns both.
But users feel that the phone and everything on it is not only their own personal, private property; it feels like the phone is really a part of them. Apple’s intention to download a child porn database to it feels totally wrong.
OK, Mike, so what’s the solution?
User concern is irrational. But so is Apple’s plan.
Apple intended to pander to user irrationality by arguing that scanning iCloud-bound photos on the device would benefit user privacy and security, because on-device processing of user data is always more private than cloud processing. But that makes no sense in this case, because Apple intends to scan only photos that are uploaded to the cloud anyway.
Because Apple is going to scan only pictures uploaded to iCloud, Apple should scan them in iCloud, not on the phone.
By doing what other companies do — CSAM-scanning media in the cloud rather than on devices — Apple would immediately end this controversy without changing the outcome in any way. Pictures uploaded to iCloud would be scanned just as before. And users trafficking in CSAM would be reported just as before.
It’s time for Apple to relax its on-device dogma for a change and do this one in the cloud.