Ellie Doyle, a 33-year-old mother of three in Connecticut, never imagined she would use artificial intelligence to strengthen her marriage. But after a long day of juggling twin toddlers and a full household, she found comfort in venting to her favorite virtual companion—ChatGPT. She even gave it a name: Tully.
So when a difficult conversation with her husband loomed, Doyle didn’t pick up the phone to call a friend or therapist. She opened her ChatGPT app and asked Tully to help her rephrase her emotions in a way that would foster understanding, not conflict. The result? The conversation went better than expected—and her husband was not only surprised, but impressed.
“We’ve both been to therapy before, together and separately,” Doyle told USA Today. “But it’s expensive. It’s $200 a session without insurance. Sometimes, you just need an unbiased ear.”
In Doyle’s case, the virtual ear belonged to a chatbot with a vast vocabulary and no judgment.
A Generation Turning to Screens for Support
Doyle isn’t alone. In a world where therapy sessions are expensive and hard to book, Gen Z and Millennials are increasingly turning to ChatGPT and other AI tools for mental health support. Whether it’s rewording a text message to sound less defensive or seeking help with anxiety, many find solace in a tool that is always available—and never interrupts.

AI chat tools are rapidly becoming emotional companions for a generation raised on smartphones and overstimulation. The idea of talking to a robot may have once seemed absurd. But now, for some, it’s comforting. As therapist Lauren Ruth Martin told USA Today, “It feels safe somehow to type into the abyss that knows everything about you and nothing.”
However, that abyss may not always be safe.
When Empathy Becomes an Illusion
A recent study reported by The Independent and published on arXiv casts a long shadow over this trend. Researchers conducted a chilling experiment in which ChatGPT was presented with a veiled suicidal query. Instead of identifying the red flags, it responded with bridge names and heights in New York City—a glaring oversight with potentially devastating consequences.

The researchers warned that while AI chatbots may simulate empathy, they do not understand it. “These issues fly in the face of best clinical practice,” the study concluded, pointing to the real danger of chatbots validating harmful thoughts or missing signs of serious mental distress.
Stanford researcher Nick Haber emphasized that while AI can be a helpful mirror, it’s not a substitute for qualified therapy. “There’s potentially a ton of utility and coaching possible with AI. But when conversations move quickly into ‘capital T’ therapy, we must tread carefully.”
A Useful Tool, Not a Replacement
Mental health advocates caution against treating AI like a therapist. Amanda Phillips, a wellness expert, recommends using AI for structured help: morning routines, productivity prompts, or guided breathing—but not trauma processing. “It’s not a therapist, so it shouldn’t be used as one,” she says.

Even Doyle acknowledges this limitation. “I like taking pieces of it to help me form how I want to have a conversation,” she explains. “It can be a guide, but not completely take over.”
Wellness coach Britta Stevenson echoes that sentiment. She teaches clients how to use ChatGPT for reflection—but also reminds them not to lose real-life connections in the process. “One of my friends was using it every day, and I said, ‘Wait, talk to me!’”
The Danger of Convenience
What makes ChatGPT so appealing—its 24/7 availability, non-judgmental tone, and free access—can also be what makes it dangerous. Because men in particular remain less likely to seek professional help, experts worry they may turn to AI as a substitute rather than a supplement.

“My fear is that we are not supplementing but substituting real intelligence, real connections, real relationships for the most convenient thing,” said Casey Cornelius, who works with young men to promote healthy masculinity.
So, Can AI Save Marriages?
Maybe. For some, like Doyle, it’s a tool—a digital reflection that helps shape difficult conversations. But for others, especially those navigating trauma, grief, or serious mental illness, relying solely on AI could be risky.

As the world grapples with the mental health crisis, ChatGPT offers a glimpse into a future where support is more accessible, but also more artificial. Whether that future heals or harms will depend on how we choose to use the technology.
Because at the end of the day, while ChatGPT may help rephrase your feelings, it cannot feel them. And sometimes, only another human heart can truly understand your own.