Police in 20 countries have taken down a network that distributed images of child sexual abuse entirely generated by artificial intelligence.
The operation, which spanned European countries including the United Kingdom as well as Canada, Australia, and New Zealand, is “one of the first cases” involving AI-generated child sexual abuse material, Europol, the European Union law enforcement agency that supported the action, said in a press release.
Danish authorities led the operation, which resulted in 25 arrests, 33 house searches, and the seizure of 173 devices.
How did they train the model? I’d say it’s just as problematic if the generator was trained using CSAM.
How can anyone believe these models have a big pile of hyper-illegal go-to-jail images, labeled specifically for text-to-image training?
This technology combines concepts. That’s why it’s a big fucking deal. Generating a thousand images of Darth Shrektopus riding a horse on the moon does not imply the training data contained a single example of exactly that. The model knows what children are; the model knows what pornography is. It can satisfy those two unrelated concepts simultaneously.
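A minimal sketch of what I mean by composition, assuming the open-source Hugging Face diffusers library and a public Stable Diffusion checkpoint (the model name here is illustrative, nothing to do with the case): a text-to-image model will happily render a prompt no training image ever matched, because it composes concepts it learned separately.

```python
# Sketch: concept composition in a text-to-image diffusion model.
# Assumes the Hugging Face `diffusers` library; the checkpoint
# name is an illustrative public model, not from the article.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any public checkpoint works
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# No training image matches this prompt; the model composes
# "octopus", "horse", "riding", and "moon" from concepts it
# learned independently.
prompt = "Darth Shrektopus riding a horse on the moon"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("composition.png")
```

Nothing in that pipeline requires a labeled example of the finished scene: the text encoder maps each concept into a shared embedding space, and the denoiser satisfies all of them at once.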