The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material in 2025, with “the vast majority” stemming from Amazon.

  • smeg@infosec.pub · 3 days ago

    All of the AI tools know how to make CP somehow, probably because their creators fed it to them.

    • Grimy@lemmy.world · edited · 3 days ago

      If it knows what children look like and what sex looks like, it can extrapolate. That being said, I think all photos of children should be removed from the datasets, regardless of sexual content, because of this.

        • Grimy@lemmy.world · 3 days ago

          Thank you, I almost forgot. I was busy explaining to someone else how their phone isn’t actually smart.

    • phx@lemmy.world · 2 days ago

      They fed them the Internet, including libraries of pirated material. It’s like drinking from a fountain at a sewage plant.

    • stoly@lemmy.world · 3 days ago

      There will be a lot of medical literature with photos of children’s bodies to demonstrate conditions, illnesses, etc.

      • Phoenixz@lemmy.ca · 3 days ago

        Yeah, press X to doubt that AI is generating child pornography from medical literature.

        These fuckers have fed AI anything and everything to train them. They’ve stolen everything they could without repercussions; I wouldn’t be surprised if some of them fed their AIs child porn because “data is data” or something like that.

        • vaultdweller013@sh.itjust.works · 3 days ago

          Depending on how they scraped data, they may have just let their crawlers run wild. Eventually they would’ve run into child porn, which is yet another reason why this tech is utterly shit. If you can’t control your tech you shouldn’t have it, and frankly speaking, curation is a major portion of any data processing.
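
For illustration only: a minimal Python sketch of the kind of curation gate the last comment is describing, assuming a hypothetical domain allowlist and a hypothetical hash blocklist. Real pipelines match perceptual hashes against clearinghouse lists (such as NCMEC's), which also catch near-duplicates; plain SHA-256 digests are used here only to keep the sketch self-contained, and none of the names refer to any real crawler.

```python
import hashlib
from urllib.parse import urlparse

# Hypothetical allowlist: the crawler may only ingest from these domains.
ALLOWED_DOMAINS = {"commons.wikimedia.org", "archive.org"}

# Hypothetical blocklist of SHA-256 digests of known-bad files.
# (This entry is the digest of empty bytes, used purely for the demo below.)
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def domain_allowed(url: str) -> bool:
    """Gate 1: never ingest from outside the explicit allowlist."""
    return urlparse(url).hostname in ALLOWED_DOMAINS


def hash_flagged(payload: bytes) -> bool:
    """Gate 2: drop anything whose digest matches the known-bad list."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_BAD_HASHES


def curate(candidates: list[tuple[str, bytes]]) -> list[tuple[str, bytes]]:
    """Keep only items that pass both gates; everything else is discarded
    before it can reach a training set."""
    return [
        (url, payload)
        for url, payload in candidates
        if domain_allowed(url) and not hash_flagged(payload)
    ]


if __name__ == "__main__":
    items = [
        ("https://commons.wikimedia.org/a.jpg", b"fake-image-bytes"),  # kept
        ("https://sketchy.example/b.jpg", b"other-bytes"),  # wrong domain: dropped
        ("https://archive.org/c.jpg", b""),  # digest on blocklist: dropped
    ]
    print([url for url, _ in curate(items)])
    # -> ['https://commons.wikimedia.org/a.jpg']
```

The design point is the one the commenter makes: the gates sit in front of the dataset, so anything the crawler cannot positively clear never enters the training data at all.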