• brsrklf@jlai.lu
    2 days ago

    Holy shit, I thought it would just be another story of the assistant answering a “Tell me how to die” request (and it did, and that’s terrible enough), but it gets even worse.

    There’s the part where the kid says he’d want to be stopped, and the assistant tells him he should hide better to make sure nobody can.

    • FreedomAdvocate
      2 days ago

      He had told it that he was writing a story, so all of this was framed as being for the story. He didn’t get anything from ChatGPT that he couldn’t have gotten from a search engine, a chat room, or Reddit.

      He was mentally ill, his feelings were affirmed, and he made a terrible decision that he was clearly in no mental state to make, with severe consequences. Hopefully some people learn lessons from this.