Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
By not doing more to prevent the spread of this material, the lawsuit argues, Apple is forcing victims to relive the trauma they have experienced, according to The New York Times. The suit says Apple announced a “widely promoted improved design intended to protect children” but then failed to “implement those designs or take any measures to detect and reduce these materials.”
Apple first announced the system in 2021, explaining that it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. However, it appears to have abandoned those plans after security and privacy advocates pointed out that it could create a backdoor to government surveillance.
The lawsuit was reportedly filed by a 27-year-old woman who is suing Apple under a pseudonym. She said a relative molested her when she was a child and shared photos of her online, and that she still receives law enforcement notices nearly every day about someone being charged with possessing those images.
Attorney James Marsh, who is involved in the lawsuit, said there is a potential group of 2,680 victims who could be entitled to compensation in this case.
TechCrunch has reached out to Apple for comment. A company spokesperson told The Times that the company is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
In August, a 9-year-old girl and her guardian sued Apple, accusing the company of failing to address CSAM on iCloud.