Apple To Only Seek Abuse Photos Flagged In Many Countries


Apple announced on Friday that it would only search for images of child sexual abuse that have been flagged by clearinghouses in multiple countries.

Apple also dismissed criticism that the new system could be used to target individuals, saying that researchers can verify that the list of image identifiers on one iPhone matches the lists on all other phones.

Addressing questions about how many matched images on a phone or computer it would take before the operating system alerts the company for a human review, Apple executives said on Friday that the threshold would start at 30 images and could be lowered over time as the system improves.

Apple acknowledged that it had handled communications around the feature poorly, but declined to say whether the criticism had altered any of its policies or software.

For more information, read the original story at Reuters.
