Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma.
In 2021, Apple announced a system that would detect known CSAM in users’ iCloud libraries by matching images against digital signatures collected by the National Center for Missing and Exploited Children (NCMEC) and other groups. But it shelved those plans after security and privacy advocates warned that the technology could create a backdoor for government surveillance.
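For illustration only, here is a minimal sketch of the general idea behind that kind of detection: computing a fingerprint for each image and checking it against a database of known signatures. The `KNOWN_SIGNATURES` set, the file paths, and the use of SHA-256 are assumptions made for this example; Apple's announced design relied on on-device perceptual hashing (NeuralHash) and cryptographic matching techniques, not plain file hashes.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-content signatures (hex digests).
# In a real system these would be perceptual hashes supplied by NCMEC,
# not cryptographic hashes of the raw files.
KNOWN_SIGNATURES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint for an image file (SHA-256 here, for illustration only)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def scan_library(library_dir: Path) -> list[Path]:
    """Return the image files whose fingerprints match a known signature."""
    return [
        path
        for path in library_dir.glob("*.jpg")
        if fingerprint(path) in KNOWN_SIGNATURES
    ]
```

A cryptographic hash like the one above only matches bit-identical files; systems of the kind Apple described use perceptual hashes precisely so that matches survive resizing, cropping and re-encoding.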
The case was filed by a 27-year-old woman who was abused as an infant by a relative and whose images were shared online. She said she still receives notices from law enforcement nearly every day that someone has been charged with possessing those images. Attorney James Marsh, who is involved in the case, said there are 2,680 potential victims who could be entitled to compensation.
In a statement to TechCrunch, Apple said it continues to innovate to combat these crimes without compromising user security and privacy. This is not the first such suit: in August, a 9-year-old girl and her guardian sued Apple, accusing the company of failing to address CSAM in iCloud.