• kungen@feddit.nu

    I think it’s disgusting that X probably doesn’t see a problem with it, but it still wouldn’t be legally classified as CSAM, no?

    • ILikeTraaaains@lemmy.world

      In some places it is already classified as CSAM, and in others legislation to that effect is being worked on.

      I think the issues are:

      • It can pass as real
      • Unlike run-of-the-mill cartoon porn, real photos of children (even ones that are not CSAM themselves) are involved, either in the training data or as input for the generation