Apple has postponed its plans to scan devices for child abuse and exploitation material after the tool provoked a backlash from users and privacy advocates.
The new security features would have been included in iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
Apple planned to introduce a number of features to increase security and user safety. These included scanning Messages and alerting users when sexually explicit images were sent, as well as changes to Siri and Search to warn parents and children in “unsafe” situations, including intervening when a user searched for Child Sexual Abuse Material (CSAM).
The most controversial feature was a CSAM scanning tool designed to “protect children from predators who use communication tools to recruit and exploit them.” The tool would use cryptography “to help limit the spread of CSAM online” while respecting users’ privacy.
The scanner immediately became the subject of controversy online and led to criticism from privacy advocates and cryptography experts.
Shortly thereafter, Apple chose to postpone the rollout so the tech giant could take “additional time” to analyze the tools and their potential impact.
For more information, view the original story from ZDNet.