Apple is being sued in the US for its decision not to implement a system that scanned iCloud photos for child sexual abuse material (CSAM).

The company announced the feature back in 2021 but never launched it, and, the suit alleges, "does not take any measures to detect or restrict" this material. The lawsuit was filed by a 27-year-old woman who says she was molested by a relative as an infant, and that the relative posted images of the abuse online. She still receives notifications from law enforcement almost every day that someone has been charged with possessing those images.

Attorney James Marsh, who is involved in the lawsuit, said there is a potential group of 2,680 victims who could be entitled to compensation in this case.

Apple said it is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." [The Verge]

Source: Iphones RU
