Apple Sues Over Alleged CSAM in iCloud

Apple is facing a lawsuit filed Thursday by West Virginia Attorney General JB McCuskey alleging that iCloud is used to store and distribute child sexual abuse material (CSAM) online. McCuskey says Apple has known about this “for years” and “chose to do nothing about it.”

The lawsuit contains screenshots of a February 2020 iMessage conversation in which Apple executives Eric Friedman and Herve Sibert acknowledge the storage and distribution of CSAM on iCloud.

“In an iMessage conversation about whether Apple is placing too much emphasis on privacy and not enough on the trust and safety of children, Friedman boasted that iCloud is a ‘big place to distribute child pornography’ and that Apple has ‘chosen to be ignorant in enough places where we can’t tell,'” the lawsuit says.

“In the same conversation,” it continues, “Friedman referred to a New York Times article about the CSAM discovery and revealed that he suspects Apple is underreporting the extent of the CSAM problem it has in its products.”

The case points to the number of CSAM reports Apple made to the National Center for Missing and Exploited Children in 2023 (267), compared with Google (1.47 million) and Meta (30.6 million).

The lawsuit alleges that Apple failed to use tools to detect CSAM, including a proprietary scanning tool it had been developing. In 2021, Apple announced a plan to scan photos stored in iCloud for CSAM, but it abandoned the effort the following year.

The role of end-to-end encryption

The lawsuit also points to Apple’s Enhanced Data Protection, which became available for iCloud in December 2022 and enables end-to-end encryption of photos and videos stored on the platform. The lawsuit alleges that the encryption is “an impediment to law enforcement, including the identification and prosecution of CSAM perpetrators and abusers.”

“Maintaining the privacy of child predators is absolutely inexcusable,” McCuskey said in a statement Thursday. “Since Apple has so far refused to take responsibility and do the right thing, I am filing this lawsuit to demand that Apple follow the law, report these images and stop abusing children by allowing these images to be stored and shared.”

Apple told CNET that “safety and privacy” are at the heart of its decisions, especially for children.

“We’re innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” Apple said Thursday. “All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on children’s devices when nudity is detected in Messages, shared photos, AirDrop and even live FaceTime calls — are designed with the safety, security and privacy of our users in mind.”

The Communication Safety feature is automatically enabled for users under 18. It is designed to protect children from CSAM content, but it does not target the adults who distribute and store CSAM.

The balance between privacy and security on the one hand, and law enforcement and cybercrime on the other, has been at the center of the debate about end-to-end encryption.

Privacy advocates such as the Electronic Frontier Foundation applauded the introduction of iCloud encryption in 2022, noting that “regular scanning of child abuse images could lead to unnecessary investigations and false positives.” The EFF also pointed out that encryption protects people’s sensitive iCloud data, such as photos, against potential cloud data breaches and government demands.

“Banning the use of end-to-end encryption will have a negative impact and is against the security and privacy of everyone online,” EFF security and privacy activist Thorin Klosowski said in a statement. “Encryption is the best way we have to protect privacy online, especially for young people.”

Data breaches are on the rise, as are government and law enforcement requests for user data for various reasons. Apple publishes a transparency report detailing how many government requests for user data it receives, although the report currently appears to end in December 2024.

End-to-end encryption is also used by Google for its messaging services, as well as by popular messaging apps like WhatsApp, Signal and Telegram.

The complaint was filed in the District Court of Mason County, West Virginia, on February 19.

The filing follows a class action lawsuit brought at the end of 2024 in the US District Court for the Northern District of California by 2,680 plaintiffs, who claim that Apple’s decision to stop CSAM scanning amounts to the technology giant knowingly allowing its distribution and storage on iCloud. In August 2024, a similar lawsuit was filed on behalf of a 9-year-old sexual assault victim in North Carolina.
