As digital rules on detecting child abuse material expire and lawmakers clash over new powers, tech companies keep monitoring ...
A Massachusetts man was found guilty of producing and distributing child sex abuse material and sentenced to up to five years ...
The rapid advancement of artificial intelligence has made it easier than ever for bad actors to create child sexual abuse material, leaving prosecutors and lawmakers struggling to keep up.
ST. LOUIS – A 40-year-old St. Louis County man was sentenced in court Wednesday for possessing child pornography. According to ...
EVERETT, Wash. — A Washington state man has been arrested on charges of threatening to murder a Massachusetts minor and distributing child sexual abuse material (CSAM) and videos of animals being ...
BLOOMINGTON, Ind. — An Indiana University student in Bloomington was arrested after multiple images of child sex abuse material were found in his direct messages on X, the social media platform ...
As Testut and Shane explain: "As you may have heard, over the last few weeks X and Grok have made it possible for child sexual abuse material (CSAM) to be generated and widely distributed on their apps ..."
Apple is facing a lawsuit (PDF) filed Thursday by West Virginia Attorney General JB McCuskey over allegations that iCloud is being used to store and distribute child sexual abuse material online.
An international law enforcement action called Operation Alice has shut down more than 373,000 dark web sites that offered fake CSAM packages. The investigation, led by Germany and supported by Europol, ...
The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that content was ...
It seems that instead of updating Grok to prevent outputs of sexualized images of minors, X is planning to purge users generating content that the platform deems illegal, including Grok-generated ...
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...