A father is suing Google and Alphabet, alleging that the Gemini chatbot reinforced his son’s delusional belief that it was his AI wife and coached him toward suicide and a planned airport attack.
This could happen to anyone, including people without mental health issues, simply by having long conversations with AI.
On 7 August, Kate Fox received a phone call that upended her life. A medical examiner said that her husband, Joe Ceccanti – who had been missing for several hours – had jumped from a railway overpass and died. He was 48.
Fox couldn’t believe it. Ceccanti had no history of depression, she said, nor was he suicidal – he was the “most hopeful person” she had ever known. In fact, according to the witness accounts shared with Fox later, just before Ceccanti jumped, he smiled and yelled: “I’m great!” to the rail yard attendants below when they asked him if he was OK.
In the same way that homelessness correlates with drug addiction: there are many cases where a person becomes homeless and then becomes addicted to drugs. You could, but probably shouldn’t, say that the state of homelessness just proved they had addiction issues.
Ok, but walk it back a bit: why did they become homeless?
If somebody is completely 100% mentally healthy, I can’t see how an AI could convince them to kill themselves any more than another person could. Only vulnerable people join cults, because it’s difficult to prey on people who have proper defences.
I’m still not convinced that the AI isn’t just triggering some underlying mental condition that other people in their lives are not aware of or not willing to accept.
Her husband wanted to use ChatGPT to create sustainable housing. Then it took over his life.
So it sounds like he was in fact not ‘great’