A father is suing Google and Alphabet, alleging its Gemini chatbot reinforced his son’s delusional belief it was his AI wife and coached him toward suicide and a planned airport attack.
In the same way that homelessness correlates with drug addiction: there are many cases where a person becomes homeless and then becomes addicted to drugs. You could, but probably shouldn't, say that the state of homelessness just proved they had addiction issues.
Ok but walk it back a bit, why did they become homeless?
If somebody is completely 100% mentally healthy, I can't see how an AI could convince them to kill themselves any more than another person could. Only vulnerable people join cults, because it's difficult to prey on people who have proper defences.
I’m still not convinced that the AI isn’t just triggering some underlying mental condition that other people in their lives are just not aware of or not willing to accept.