Apple is delaying its child safety features
Apple says it’s delaying the rollout of its Child Sexual Abuse Material (CSAM) detection tools “to make improvements” following pushback from critics. Among the features is one that scans iCloud Photos for known CSAM, which has drawn concern from privacy advocates.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple told 9to5Mac in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple had planned to roll out the CSAM detection features as part of its upcoming OS updates, namely iOS 15, iPadOS 15 and macOS Monterey, which are expected in the coming weeks. The company didn’t say what improvements it might make. Engadget has contacted Apple for comment.
Developing…