Police in 20 countries have taken down a network that distributed images of child sexual abuse entirely generated by artificial intelligence.

The operation – which spanned European countries including the United Kingdom, as well as Canada, Australia and New Zealand – is “one of the first cases” involving AI-generated child sexual abuse material, Europe’s law enforcement agency Europol, which supported the action, said in a press release.

Danish authorities led the operation, which resulted in 25 arrests, 33 house searches and the seizure of 173 devices.

  • mindbleach@sh.itjust.works · 2 months ago

    You can’t… generate… abuse.

    No more than you can generate murder.

    The entire point of saying “child abuse images” is to distinguish evidence of rape from, just, drawings.

    If you want drawings of this to also be illegal, fine, great, say that. But stop letting people use the language of actual real-world molestation of living human children, when describing some shit a guy made up alone.

      • mindbleach@sh.itjust.works · 2 months ago

        How can anyone believe these models have a big pile of hyper-illegal go-to-jail images, labeled specifically for word-to-image training?

        This technology combines concepts. That’s why it’s a big fucking deal. Generating a thousand images of Darth Shrektopus riding a horse on the moon does not imply a single example of exactly that. The model knows what children are. The model knows what pornography is. It can satisfy those unrelated concepts, simultaneously.