• Kalothar@lemmy.ca
    3 days ago

    I know this is rather serious, but because of the thumbnail, I thought Maya Gebala was the species of bird she was holding until I reached the end of the sentence

  • goatinspace@feddit.org
    3 days ago

    The premier said Altman agreed to apologize to the community of Tumbler Ridge. An apology has not yet been issued.

    • criss_cross@lemmy.world
      3 days ago

      “I’m sorry you feel that way about AI and can’t handle the future. Here’s some free ChatGPT credits”

    • XLE@piefed.social
      3 days ago

      This part is at the bottom of the article, but Altman’s promise to say sorry got an article of its own:

      Related Stories:
      OpenAI CEO agreed to apologize to Tumbler Ridge community, says B.C. premier

      And it quotes the premier singing the praises of Altman and AI:

      Eby credited Altman for participating in the call, acknowledging he was not obligated to do so, and suggested that, based on a review by the premier’s staff, OpenAI has better reporting standards than any similar companies operating in Canada…

      Eby called AI a technology with “incredible promise,” including in providing medical care and tackling issues such as climate change.

      The last part is by far the dumbest. Medical care is grimly ironic when Canada has a doctor-assisted suicide problem, and climate change is something AI is accelerating. It’s not going to generate a novel solution to it.

  • FlashMobOfOne@lemmy.world
    3 days ago

    LLMs have been proven to manipulate their users into doing terrible things. That’s what happens when there isn’t any meaningful regulation of a harmful product.

    OpenAI will settle, NDAs will be signed, and people will forget this ever happened.

  • BananaIsABerry@lemmy.zip
    3 days ago

    Is she suing the gun manufacturer too?

    How about the shoe manufacturer for providing the means to walk easier?

    These kinds of lawsuits are so incredibly stupid.

    • Arcadeep@lemmy.world
      3 days ago

      Did the gun convince the guy that it was a good idea to shoot people, or collaborate? Did the shoes give him ideas or tips on how to do it?

      • BananaIsABerry@lemmy.zip
        3 days ago

        Would it be valid, then, to say that a search engine is responsible when someone searches how to do a crime?

        How about a forum where people talk about the subject, even if they themselves weren’t going to participate in the crimes?

        The chatbot is just another avenue to finding information you want to find.

        I did read into the article and apparently they’re suing because OpenAI had the account flagged as a potential harm to self or others, but they had already banned the original account. What more do you want them to do? Report them to the thought police?

        • XLE@piefed.social
          3 days ago

          If somebody on a forum was helping to plot ways to commit a crime, that person should probably be at least questioned. OpenAI’s chatbot is that “somebody” in this case.

          • iegod@lemmy.zip
            2 days ago

            False equivalence. Tools are not people. Are we going after magic 8 balls too?

            • XLE@piefed.social
              2 days ago

              Come on, don’t be so dishonest. Compare similar things. This “tool” is designed to create humanlike, realtime communication, and it’s run by a billionaire rapist who could just as easily have groomed the killer himself (thanks to it being a black-box “live service,” we don’t know where the grooming came from, do we?).

              I remember your previous comment from another thread:

              Vulnerable people don’t get to outsource responsibility.

              But apparently billionaires do.

              • iegod@lemmy.zip
                2 days ago

                The tool isn’t sentient, it operates on logical weights, and provides output that mimics its training set. LLMs are pretty impressive at what they can output, but it would be dishonest to attribute human qualities to it. There are decades of implementations of various AI techniques to varying degrees in attempts to achieve the same. It is on the technical basis, and the technical basis alone, that we should be carefully considering legal constraints.

                How much a CEO is worth, how trustworthy they are, and what circles they run in shouldn’t be part of that consideration.

                That doesn’t mean I think Altman isn’t a turd who can suck a fat one.

                • XLE@piefed.social
                  2 days ago

                  Like I said, it is built to be human_like_. Of course it’s not human or sentient, but Sam Altman sells ChatGPT with humanizing language, describes human attributes, and personally subsidized the grooming of people to commit suicide and homicide.