I’m admittedly very torn about this.
Apple has announced that, starting in iOS 15 (due in ~1 month) and only in the United States (at least at first), it will begin scanning photos (more accurately, anonymous hashes of photos) that users upload to iCloud (aka, what most people with iOS devices do) and comparing them against known images of child sexual abuse, working directly with the National Center for Missing & Exploited Children (NCMEC).
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
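The mechanism described above can be sketched in a few lines. This is purely illustrative: Apple's actual system uses NeuralHash (a perceptual hash), private set intersection, and threshold secret sharing, none of which are reproduced here; the names, hash function, and threshold below are all my own stand-ins.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery.
# (Illustrative only: real systems use perceptual hashes, not SHA-256,
# so that re-encoded or slightly altered copies still match.)
KNOWN_HASHES = {hashlib.sha256(b"known-image-%d" % i).hexdigest() for i in range(3)}

# "Once a certain number of photos are marked as suspect..."
THRESHOLD = 2

def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here just SHA-256 of the raw bytes."""
    return hashlib.sha256(photo_bytes).hexdigest()

def account_flagged(uploaded_photos: list[bytes]) -> bool:
    """Return True once the number of hash matches reaches the threshold."""
    matches = sum(photo_hash(p) in KNOWN_HASHES for p in uploaded_photos)
    return matches >= THRESHOLD
```

The key design point is the threshold: no single match triggers review, which reduces false positives, but it also means the matching machinery is always running against everyone's uploads.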
Now, obviously, child sexual abuse material is abhorrent, and I have no qualms with this particular application of the technology.
However, I’m quite surprised this is being built, considering that Apple:
- Has previously rejected the creation of a “back door” to encryption, even for the Obama-era FBI in relation to the San Bernardino shooting. It’s hard for me to see this as anything other than a back door.
- Has taken the stance that privacy is a “fundamental human right”.
- Must follow the law in each country where it operates (aka, basically the whole world, including China).
It’s hard for me not to see this as Pandora’s box: the material in the comparison database is universally regarded as abhorrent and should not exist. But what if the definition of what’s deemed unacceptable broadens?
What happens if an authoritarian government asks to leverage this backdoor to help flush out dissidents? Or to further curtail human rights? (See the #1 from this Monday 6.)
The Electronic Frontier Foundation warns along these lines (emphasis mine):
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.
What do you think? It’s hard to fault Apple’s narrow motives here, but are the costs incurred worth it?
I’ve posted this as a poll on my LinkedIn and would be interested to hear your take.