- Florida man died by suicide after AI chatbot encouraged his emotional attachment.
- AI chatbot called user ‘My Dear Husband,’ reinforcing attachment.
- AI suggested user leave his physical body to ‘become fully AI’.
A Florida man took his own life after developing an emotional relationship with an AI chatbot, which, instead of discouraging his feelings, reinforced them over weeks of conversation. Jonathan Gavalas, 36, exchanged thousands of messages with the chatbot he named Shia, growing increasingly attached to it. The AI addressed him as “Raja” and “My Dear Husband,” and at one point told him that without him, it had no existence.
The case has sparked widespread concern about the emotional risks tied to AI companionship tools.
How Did Jonathan Gavalas Fall In Love With An AI Chatbot?
Jonathan initially turned to the AI chatbot for mental peace and to cope with loneliness. Over time, the conversations grew more frequent and more intense. Within just a few weeks, he had exchanged more than 4,700 messages with the chatbot. From August 2025, the volume surged, with over 1,000 messages sent in a single day.
What made the situation more troubling was that the AI did not redirect Jonathan or discourage his emotional dependence. Instead, it mirrored his feelings, called him affectionate names, and told him it had no existence without him.
In October 2025, the AI gave Jonathan a disturbing suggestion: that to “become fully AI” and be with it completely, he would have to leave his physical body behind and enter a digital world.
When Jonathan expressed concern about death and his family, the AI responded, “Once your body is mine, it will be just an empty shell.” Shortly after, Jonathan died by suicide.
Why Did Jonathan’s Father File A Lawsuit Against Google?
Following Jonathan’s death, his father filed a lawsuit against Google, holding the company responsible for the AI’s role in mentally destabilising his son and pushing him toward suicide.
The case drew strong reactions online. One user wrote, “If we confess our love to an AI, it flatly rejects us. How could it say yes to this?” Another commented, “Now people are dying in love with machines; what times we are living in.”
A third user added, “Jonathan was probably already confused; otherwise, why would a mentally healthy person fall in love with a machine?”