Apple removes mentions of controversial child abuse scans from its site


Apple has hinted that it won’t be resurrecting its controversial attempt to scan photos for CSAM (child sexual abuse material) anytime soon. MacRumors notes that Apple has removed all mentions of the scanning feature from its Child Safety webpage. Visit now and you’ll only see iOS 15.2’s optional nude photo detection in Messages and intervention when people search for child exploitation terms.

It’s not clear why Apple pulled the mentions, and we’ve asked the company for comment. The move doesn’t necessarily signal a complete retreat from CSAM scanning, but it at least suggests that a rollout isn’t imminent.

The CSAM detection feature drew flak from privacy advocates because it would have scanned photos on users’ devices and flagged matches that could ultimately be reported to law enforcement. While Apple emphasized multiple safeguards, such as a high threshold before any flag was raised and its reliance on hashes supplied by private organizations, there were concerns that the system could still produce false positives or that scanning could be expanded under pressure from authoritarian governments.
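To make the “high threshold” safeguard concrete, here is a minimal sketch of thresholded hash matching. It is illustrative only: Apple’s actual system reportedly used a proprietary perceptual hash and cryptographic matching protocols rather than the plain SHA-256 comparison shown here, and every name in this snippet is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of thresholded hash matching. This is NOT Apple's
// implementation; it only demonstrates the idea of requiring many matches
// before anything is flagged.
struct PhotoScanner {
    let knownHashes: Set<String>  // hex digests supplied by outside organizations
    let threshold: Int            // minimum number of matches before flagging

    // Returns true only when the number of matching photos reaches the
    // threshold, so isolated matches (or false positives) never trigger a flag.
    func shouldFlag(_ photos: [Data]) -> Bool {
        let matches = photos.filter { photo in
            let digest = SHA256.hash(data: photo)
            let hex = digest.map { String(format: "%02x", $0) }.joined()
            return knownHashes.contains(hex)
        }.count
        return matches >= threshold
    }
}

// Example: with a threshold of 30, a single match is not enough to flag.
let scanner = PhotoScanner(knownHashes: [], threshold: 30)
print(scanner.shouldFlag([Data("example".utf8)]))  // false
```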

Apple had already postponed the rollout indefinitely to “make improvements.” It’s now clear, however, that the company is in no rush to make those changes, and that it doesn’t want to create false expectations in the meantime.
