Tech Newsday

Apple Defends New Child Abuse Detection Technology

Apple has defended its new technology, which searches users’ phones for child sexual abuse material (CSAM), after heavy criticism from customers and privacy campaigners who warned the technology could become a “backdoor” for spying on people.

Some digital privacy advocates warned last week that authoritarian governments could use the technology to support anti-LGBT regimes or to crack down on political dissidents in countries where protests are considered illegal.

Apple said it will not “expand” the system and that it has already put various safeguards in place to ensure the technology is used only to detect child abuse images and not repurposed for other ends.

Apple went on to explain that the system only scans photos as they are shared to iCloud, and that its anti-CSAM tool does not let the company see or scan a user’s entire photo album.

Because the system matches photos against a database of hashes of known CSAM images supplied by child protection organizations, and because every positive match is checked by a human reviewer before any report is made, Apple said it is almost impossible for innocent people to be falsely flagged to the police.
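To make the matching idea concrete, here is a minimal Python sketch of hash-database matching with a threshold and a human-review gate. It is only an illustration of the concept described above: the names, the SHA-256 stand-in hash, and the threshold value are assumptions, not Apple’s actual NeuralHash perceptual-hashing or on-device matching protocol.

```python
import hashlib
from typing import Iterable

# Hypothetical database of hashes of known CSAM images, as would be supplied
# by child protection organizations. A plain SHA-256 digest stands in here for
# Apple's perceptual NeuralHash; the values below are placeholders.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Assumed threshold: an account is flagged only after this many matches,
# which reduces the chance of a single false positive triggering review.
MATCH_THRESHOLD = 30


def image_hash(data: bytes) -> str:
    """Return a hex digest standing in for a perceptual image hash."""
    return hashlib.sha256(data).hexdigest()


def count_matches(images: Iterable[bytes]) -> int:
    """Count how many uploaded images match the known-hash database."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)


def flag_for_human_review(images: Iterable[bytes]) -> bool:
    """Flag an account for human verification only when the match count
    reaches the threshold; a flag leads to review, not directly to a report."""
    return count_matches(images) >= MATCH_THRESHOLD
```

In this sketch, as in Apple’s description, a flag never goes straight to law enforcement: it merely queues the matches for human verification.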

For more information, read the original story on the BBC.
