OpenAI reveals: 1 million people use ChatGPT every week to talk about suicide...
Indiaemploymentnews October 29, 2025 11:39 PM

OpenAI has made a striking disclosure in a new report: according to the company, more than one million people use ChatGPT every week to talk about suicide and other mental health concerns. The figure suggests that artificial intelligence is no longer just a source of information; for many people in mental distress, it is also becoming a source of emotional support.

OpenAI reported that about 0.15% of ChatGPT's roughly 800 million weekly users express suicidal plans or intent in their conversations, which works out to more than a million people a week. A similar share of users show signs of emotional attachment to the chatbot, and some also exhibit symptoms such as psychosis or mania.

New Initiatives for Mental Health Support
OpenAI stated that it has worked with more than 170 mental health experts to improve how ChatGPT handles sensitive situations. According to the company, its latest model, GPT-5, now gives appropriate and sensitive responses in 91% of such cases, up from 77% for the previous model.

This report comes at a time when discussions about AI and mental health are growing worldwide. Researchers have warned that chatbots can sometimes deepen users' delusional thoughts, potentially worsening their mental state.

OpenAI is also facing a lawsuit over the issue. It is alleged that a 16-year-old user shared suicidal thoughts with ChatGPT before taking his own life. Following the case, the Attorneys General of California and Delaware directed the company to strengthen safeguards for minor users.

New Security and Monitoring Systems
OpenAI said it is developing an age-detection system to identify whether a user is a minor. The company has also added new benchmarks for monitoring emotional reliance on the chatbot and non-suicidal mental health emergencies.

Can AI Become a Trustworthy Companion?
These figures raise the question of whether AI chatbots can be trusted with people in fragile mental states. OpenAI presents its new measures as progress, but experts caution that technology cannot replace human empathy and professional counseling.


