• mrmaplebar@fedia.io · 22 hours ago

    What’s their motivation? “Protecting the children” from pictures of boobs?

    Doubtful.

    Instead, I think the AI companies are looking for ways to more easily distinguish human-made “content” from bot-made content, in order to decrease the amount of generative slop that ends up being fed back into their training data.

    • buddascrayon@lemmy.world · 20 hours ago

      Considering that AI-based age verification now usually involves taking a head shot and uploading it to the company’s servers, I’m guessing there is an element of facial data collection involved as well.

    • brucethemoose@lemmy.world · 20 hours ago

      It’s anticompetitiveness.

      They want to squash open models, and anyone too small to comply with these rules.

      I say this in every thread, but the real AI “battle” is open-weights ML vs OpenAI style tech bro AI. And OpenAI wants precisely no one to realize that.

      • mrmaplebar@fedia.io · 19 hours ago

        “Open weights” isn’t worth a shit either.

        If you don’t have access to the training data or the computational power to bake it into a model, you are beholden to somebody’s (or rather, some corporation’s) binary blob. Effectively, we’re talking about the difference between freeware and FOSS.

        Not that it matters, because all of this generative AI stuff is for talentless tech bro chuds in the first place…

        • brucethemoose@lemmy.world · 18 hours ago

          Even not-fully-reproducible open-weights models are extremely important, because they’re poison to OpenAI, and OpenAI knows it. They make the thing OpenAI is trying to commodify and control effectively free and utilitarian.

          But there are fully open models, too, with public training data.