The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material in 2025, with “the vast majority” stemming from Amazon.
If it knows what children look like and knows what sex looks like, it can extrapolate. That said, I think all photos of children should be removed from the datasets for this reason, regardless of sexual content.
Obligatory reminder that it doesn’t “know” what anything looks like.
Thank you, I almost forgot. I was busy explaining to someone else how their phone isn’t actually smart.