• Sahwa@reddthat.comOP
    2 days ago

    This could happen to anyone, including people without mental health issues, simply by having long conversations with AI.

    On 7 August, Kate Fox received a phone call that upended her life. A medical examiner said that her husband, Joe Ceccanti – who had been missing for several hours – had jumped from a railway overpass and died. He was 48.

    Fox couldn’t believe it. Ceccanti had no history of depression, she said, nor was he suicidal – he was the “most hopeful person” she had ever known. In fact, according to the witness accounts shared with Fox later, just before Ceccanti jumped, he smiled and yelled: “I’m great!” to the rail yard attendants below when they asked him if he was OK.

    Her husband wanted to use ChatGPT to create sustainable housing. Then it took over his life.

      • XLE@piefed.social
        2 days ago

        In the same way that homelessness correlates to drug addiction. There are many cases where a person becomes homeless, and then becomes addicted to drugs. You could, but probably shouldn’t, say that the state of homelessness just proved they had addiction issues.

        • Echo Dot@feddit.uk
          3 hours ago

          Ok, but walk it back a bit: why did they become homeless?

          If somebody is completely 100% mentally healthy, I can’t see how an AI could convince them to kill themselves any more than another person could. Only vulnerable people join cults, because it’s difficult to prey on people who have proper defences.

          I’m still not convinced that the AI isn’t just triggering some underlying mental condition that other people in their lives are either unaware of or unwilling to accept.