
The controversy over Apple’s plan to protect kids by scanning your iPhone


Apple, the company that proudly touted its user privacy bona fides in its recent iOS 15 preview, recently introduced a feature that seems to run counter to its privacy-first ethos: the ability to scan iPhone photos and alert the authorities if any of them contain child sexual abuse material (CSAM). While fighting against child sexual abuse is objectively a good thing, privacy experts aren’t thrilled about how Apple is choosing to do it.

Apple’s new “expanded protections for children” might not be as bad as it seems if the company keeps its promises. But it’s also yet another reminder that we don’t own our data or devices, even the ones we physically possess. You can buy an iPhone for a considerable sum, take a photo with it, and put it in your pocket. And then Apple can figuratively reach into that pocket and into that iPhone to make sure your photo is legal.

Last week, Apple announced that the new technology to scan photos for CSAM will be installed on users’ devices with the upcoming iOS 15 and macOS Monterey updates. Scanning images for CSAM isn’t new — Facebook and Google have been scanning images uploaded to their platforms for years — and Apple can already access photos uploaded to iCloud accounts. Scanning photos uploaded to iCloud to spot CSAM would make sense and would be consistent with what Apple’s competitors already do.
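At a very high level, this kind of detection works by comparing each photo against a database of digital fingerprints of known abuse imagery rather than by "looking at" the photos themselves. The sketch below is a minimal illustration of that general idea, not Apple's actual system: the names are hypothetical, and it uses an exact SHA-256 hash for simplicity, whereas Apple's announced approach reportedly relies on a perceptual hash ("NeuralHash") designed to match visually similar images and on additional cryptographic safeguards before anything is reported.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only (not Apple's implementation): flag a photo if its
// fingerprint appears in a database of fingerprints of known illegal images.

// Hypothetical database of known-image fingerprints (hex-encoded hashes).
// In a real deployment this list would come from vetted child-safety
// organizations; it is left empty here.
let knownImageHashes: Set<String> = []

// Compute a hex-encoded SHA-256 digest of the photo's raw bytes.
// (A real system would use a perceptual hash so that resized or slightly
// edited copies of the same image still match.)
func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Returns true if the photo's fingerprint matches a known entry.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownImageHashes.contains(sha256Hex(photoData))
}
```

The key design point in the debate is not the matching itself but where it runs: this check executing on your own device, before upload, is what distinguishes Apple's plan from the server-side scanning its competitors already perform.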
