Apple To Only Seek Abuse Photos Flagged In Many Countries

Apple announced on Friday that it would only look for images of child sexual abuse that have been flagged by clearinghouses in multiple countries.

Apple also dismissed criticism that the new system could be used to target individuals, stating that researchers can verify that the list of image identifiers on one iPhone matches the lists on all other phones.

On Friday, Apple executives addressed the question of how many matched images must be found on a phone or computer before the operating system alerts the company for human review: the threshold will start at 30 images and could be lowered over time as the system improves.

Apple acknowledged that it had mishandled communications around the feature, but declined to comment on the possibility that the criticism might have altered any of its policies or software.

For more information, read the original story in Reuters.
