• snoons@lemmy.ca · 2 days ago

      Manufacturing consent is the name of the game. The bottom line is money, nobody gives a FUCK.

      System of a Down - Boom!

    • artyom@piefed.social · 2 days ago

      You can’t blame a computer for what it does. Only the user who asks for the content is to blame. /s

    • Sausager@lemmy.world · 2 days ago

      According to this article, it doesn’t actually put out porn or child porn. It’s gross, but not porn. Relevant text:

      “Nonconsensual AI-generated images of women in bikinis spreading their legs, and of children with so-called “donut glaze” on their faces”

    • kungen@feddit.nu · 2 days ago

      I think it’s disgusting that X probably doesn’t see a problem with it, but it still wouldn’t be legally classified as CSAM, no?

      • ILikeTraaaains@lemmy.world · 2 days ago

        In some places it is already classified as CSAM, and in others legislation to classify it as such is in the works.

        I think the issues are:

        • It can pass as real
        • Unlike run-of-the-mill cartoon porn, real photos of children (even if not CSAM) are involved, either in the training data or as input for the generation