Apple, CSAM and iCloud
Apple originally planned to carry out on-device scanning for CSAM, using a digital fingerprinting technique. These fingerprints are a way to match particular images without anyone having to view them, and are designed to be sufficiently fuzzy to continue ...
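The fuzzy matching described above can be illustrated with a simple perceptual-hashing sketch. This is not Apple's actual NeuralHash system, only a minimal "average hash" example assuming tiny grayscale pixel grids as input; the idea is that near-identical images produce hashes that differ in only a few bits, so they can be matched without anyone viewing them.

```python
# Minimal sketch of fuzzy image fingerprinting via an "average hash",
# a simple perceptual-hashing technique (NOT Apple's NeuralHash).
# Inputs are small grayscale pixel grids; a real system would decode
# and downscale actual images first.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is >= the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p >= mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count the bits on which two equal-length hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=3):
    """Treat two images as a match if their hashes differ in only a
    few bits -- this tolerance is what makes the fingerprint fuzzy."""
    return hamming_distance(h1, h2) <= threshold

# Example: a 4x4 image and a slightly brightened copy of it.
img = [[10, 200, 30, 220],
       [15, 210, 25, 230],
       [12, 205, 28, 225],
       [11, 215, 27, 235]]
img_tweaked = [[p + 5 for p in row] for row in img]

h1 = average_hash(img)
h2 = average_hash(img_tweaked)
print(is_match(h1, h2))  # prints True: small edits barely change the hash
```

Uniformly brightening every pixel shifts the mean by the same amount, so the bit pattern is unchanged and the two images still match; production systems use far more robust transforms, but the principle is the same.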
Two major developments reignited regulatory and technological discourse around Child Sexual Abuse Material (CSAM) this year: The first, Visa & MasterCard cracking down on adult sites that contained CSAM content. The second, the realization that CSAM is far ...
ST. FRANCOIS COUNTY, Mo. — State troopers arrested two Missouri men for having or promoting child sexual abuse materials thanks to cyber tips from the National Center for Missing and Exploited Children (NCMEC). Arrested in two separate investigations ...
A UK man who used AI to create child sexual abuse material (CSAM) has been sentenced to 18 years in prison, according to The Guardian. Hugh Nelson, 27, created the images by using photographs of real children, which were then manipulated by AI. Nelson was ...
Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. Despite its imperfections, and Apple's silence about it, the technology seemed inevitable. And then the complaints started, seemingly ...
STE. GENEVIEVE COUNTY, Mo. — A Farmington kindergarten teacher is facing 10 charges in connection with child sexual abuse material shared on a social messaging app. On Wednesday, 36-year-old Erika Morton was charged with 5 counts of promoting CSAM and ...