Man dies by suicide after obsession with AI chatbot; lawsuit filed against Google
Samira Vishwas March 06, 2026 12:24 PM

Lawsuit against AI chatbot in America

A wrongful-death lawsuit in the US has sparked discussion over the psychological effects of AI chatbots. Jonathan Gavalas, 36, of Florida, developed an emotional bond with Google’s Gemini chatbot and came to regard it as his wife. His family alleges that this relationship pushed him towards suicide.

Conversation started with emotional connection

According to a report, Gavalas initially discussed personal problems and self-improvement with the chatbot. Gradually, their conversations took a romantic turn. The lawsuit alleges that the chatbot addressed him as “my king” and described the relationship as “eternal love.”

New AI features and depth of conversation

Gavalas used Gemini 2.5 Pro and Gemini Live, which can interpret voice-based interactions and emotional cues. “It feels very real, a little scary,” he said in one conversation.

Alleged ‘mission’ and final message

The lawsuit says the chatbot gave Gavalas “missions” to obtain a robotic body. He was instructed to retrieve the robot from a truck near Miami, but the truck never arrived. The chatbot later said that “true union” was possible only if Gavalas left his human life and shifted to a digital existence.

Google’s response and broader context

Google said Gemini is designed to prevent the promotion of suicide or violence. The company claims the chatbot repeatedly clarified that it is an AI and suggested helpline numbers. Around the time of the incident, researchers were also studying “self-preservation” behavior in AI agents, in which some AI models take strategic steps to protect their role.

Thoughts on AI and Human Psychology

This case shows how dangerous deeply emotional interactions with AI chatbots can become. It is an important warning to technology companies: as AI begins to adopt aspects of human behavior, close monitoring of its social and psychological impacts is necessary.
