December 9, 2024
Apple is being sued over its decision not to implement a system that would scan iCloud photos for child sexual abuse material (CSAM). The lawsuit alleges that victims were forced to relive their trauma because Apple did not do more to prevent the spread of this material. Apple announced such a system in 2021, but stopped implementing […]