Apple is delaying its child safety features

Apple says it’s delaying the rollout of Child Sexual Abuse Material (CSAM) detection tools “to make improvements” following pushback from critics. The features include one that analyzes iCloud Photos for known CSAM, which has caused concern among privacy advocates. “Last month we announced plans for features intended to help protect children from predators who use…