
Apple Facing Criticism For Child Abuse Detection System

Apple is facing backlash over a new system that detects child sexual abuse material (CSAM) on the devices of U.S. users.

The technology looks for matches against known CSAM before an image is stored in iCloud Photos. However, there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.

WhatsApp chief executive Will Cathcart called Apple’s move “very concerning.”

Apple announced that new versions of iOS and iPadOS – to be released later this year – will have “new applications of cryptography to help stop the spread of CSAM online while designing for user privacy.” When the system reports a match, it is manually reviewed by an employee; a confirmed match can lead to the user’s account being disabled and a report being made to the authorities.

The tech giant says the new technology offers “significant” privacy benefits over existing techniques, since Apple only learns about users’ photos if they have a collection of confirmed child abuse images in their iCloud account.
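To make the match-and-threshold flow described above concrete, here is a minimal, heavily simplified sketch in Python. It is not Apple’s implementation: the real system reportedly uses a perceptual image hash and cryptographic matching rather than exact SHA-256 lookups, and the threshold value below is a placeholder (the article only refers to a “collection” of matches). All names here (KNOWN_HASHES, MATCH_THRESHOLD, should_flag_for_review) are hypothetical, for illustration only.

```python
import hashlib
from pathlib import Path

# Hypothetical database of digests for known CSAM images. In the real system
# these would be perceptual hashes supplied by child-safety organizations,
# not SHA-256 digests of file bytes.
KNOWN_HASHES: set[str] = set()

# Placeholder: the account is flagged only after the number of matches
# passes some threshold, per Apple's description of a "collection".
MATCH_THRESHOLD = 30


def image_digest(path: Path) -> str:
    """Stand-in for a perceptual hash: an exact SHA-256 digest of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos match the known-hash database before upload."""
    return sum(1 for p in photo_paths if image_digest(p) in KNOWN_HASHES)


def should_flag_for_review(photo_paths: list[Path]) -> bool:
    """Flag the account for human review only once matches reach the threshold."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The point of the sketch is the ordering of steps the article describes: matching happens against a fixed database of known material before upload, and human review is triggered only after a threshold of matches, not by any single photo.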

But WhatsApp’s Cathcart says the system “could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.”

Still, some politicians have welcomed Apple’s development.

The U.K. Health Secretary Sajid Javid said it was time for others, particularly Facebook, to follow suit, and U.S. Senator Richard Blumenthal hailed Apple’s move as a “welcome, innovative, and bold step.”

For more information, read the original story from the BBC.
