Child abuse: Apple urged to roll out image-scanning tool swiftly

Child protection experts from across the world have called on Apple to implement new scanning technologies urgently to detect images of child abuse.

In August, Apple announced plans to use a tool called neuralMatch to scan photos being uploaded to iCloud online storage and compare them to a database of known images of child abuse.
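The article does not detail how neuralMatch performs the comparison; Apple's design relies on a perceptual hash (NeuralHash) and on-device private set intersection. As a rough illustration only, the sketch below shows the general idea of checking an image fingerprint against a database of known hashes before upload, using SHA-256 as a stand-in for a perceptual hash and hypothetical function names.

```python
# Illustrative sketch only: check an image's fingerprint against a set of
# known hashes before upload. Apple's actual system uses a perceptual hash
# (NeuralHash) and cryptographic matching protocols not shown here;
# sha256 is just a placeholder fingerprint.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known abuse images,
# distributed to the client as opaque hash values.
KNOWN_HASHES: set[str] = set()

def fingerprint(image_path: Path) -> str:
    """Stand-in fingerprint. A real perceptual hash maps visually similar
    images to the same value; a cryptographic hash like sha256 does not."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_before_upload(image_path: Path) -> bool:
    """Return True if the image's fingerprint matches a known entry."""
    return fingerprint(image_path) in KNOWN_HASHES
```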

However, the company has since paused the rollout after heavy lobbying from privacy campaigners, who raised concerns that governments could misuse neuralMatch to expand surveillance of private citizens.

Ross Anderson, a professor of security engineering at Cambridge University and Edinburgh University, wrote: “Child protection online is an urgent problem, but this proposal will do little to prevent these appalling crimes, while opening the floodgates to a significant expansion of the surveillance state.”

This week, child protection agencies, including the NSPCC, the National Center for Missing and Exploited Children (NCMEC) and the UN special rapporteur on the sale and sexual exploitation of children, released a joint statement endorsing neuralMatch and saying that “time is of the essence” to use new technology to help protect children from online exploitation and abuse.
