Apple removes mentions of controversial child abuse scanning from its site


Apple has hinted it might not revive its controversial effort to scan for CSAM (child sexual abuse material) photos any time soon. MacRumors notes Apple has removed all mentions of the scanning feature from its Child Safety website. Visit the page now and you'll see only iOS 15.2's optional nude photo detection in Messages and the interventions shown when people search for child exploitation terms.

It’s not certain why Apple has pulled the references. We’ve asked the company for comment. This doesn’t necessarily represent a full retreat from CSAM scanning, but it at least suggests a rollout isn’t imminent.

The CSAM detection feature drew flak from privacy advocates because it would scan photos on users' devices and flag matches that could ultimately be reported to law enforcement. While Apple stressed that multiple safeguards were in place, such as a high threshold of matches before any account was flagged and its reliance on hashes supplied by private organizations, critics worried the company might still produce false positives or expand scanning under pressure from authoritarian governments.
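For readers curious what "a high threshold of matches" means in the abstract, here's a minimal, hypothetical sketch in Swift of threshold-based hash matching. It is not Apple's implementation (which involved considerably more cryptographic machinery); every type, name and value below is invented purely for illustration.

```swift
// Illustrative toy model only: count how many of a device's photo hashes
// appear in a known set, and flag only when a threshold is crossed.
struct HashMatcher {
    let knownHashes: Set<String>   // hashes supplied by an outside organization (hypothetical)
    let reportThreshold: Int       // matches required before anything is flagged

    // Number of device hashes that appear in the known set.
    func matchCount(deviceHashes: [String]) -> Int {
        deviceHashes.filter { knownHashes.contains($0) }.count
    }

    // A flag is raised only once the match count reaches the threshold;
    // isolated matches on their own trigger nothing.
    func shouldFlag(deviceHashes: [String]) -> Bool {
        matchCount(deviceHashes: deviceHashes) >= reportThreshold
    }
}

let matcher = HashMatcher(knownHashes: ["a1b2", "c3d4"], reportThreshold: 30)
print(matcher.shouldFlag(deviceHashes: ["a1b2", "ffff"]))  // false: below threshold
```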

Apple delayed the rollout indefinitely to “make improvements.” However, it’s now clear the company isn’t in a rush to complete those changes, and doesn’t want to set expectations to the contrary.
