
Apple Drops Controversial Plan To Check iOS Devices, iCloud Photos For Child Abuse Imagery

Apple is abandoning its controversial tool that would have checked iPhones, iPads and iCloud photos for child sexual abuse material (CSAM).


December 9, 2022

Apple is abandoning its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM) following backlash from critics who decried the feature's potential privacy implications.


Apple first announced the feature in 2021, with the goal of helping combat child exploitation and promoting safety, issues the tech community has increasingly embraced. But it soon put the brakes on implementing the feature amid a wave of criticism, noting it would "take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

In a public statement Wednesday, Apple said it had "decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos."
