People trust AI over doctors for medical advice, MIT study
NewsBytes January 25, 2026 07:40 PM




A recent study by researchers from the Massachusetts Institute of Technology (MIT) has found that people trust medical advice from artificial intelligence (AI) more than that from human doctors.

The research, published in the New England Journal of Medicine, involved 300 participants who were asked to evaluate medical responses written either by a doctor or generated by an AI model such as ChatGPT.


AI responses rated more accurate and trustworthy
Study findings


The study participants, both experts and non-experts in the medical field, rated AI-generated responses as more accurate, valid, trustworthy, and complete.

Neither group could reliably distinguish between AI-generated responses and those provided by human doctors.

This suggests that people may favor AI-generated responses even though such systems can produce inaccurate information.


Participants showed high tendency to follow AI's low-accuracy advice
Risk of misinformation


The study also found that participants rated low-accuracy AI-generated responses as valid, trustworthy, and satisfactory.

Participants were also highly inclined to follow this potentially harmful medical advice and, as a result, to seek unnecessary medical attention.

This highlights the risk of misinformation from AI systems in the field of medicine, which could have serious consequences for patients' health and well-being.


Cases of harmful AI-generated medical advice
Real-world incidents


There have been several documented cases of AI giving harmful medical advice.

One case involved a 35-year-old Moroccan man who had to go to the ER after a chatbot told him to wrap rubber bands around his hemorrhoid.

In another incident, a 60-year-old man poisoned himself after ChatGPT suggested he could reduce his salt intake by consuming sodium bromide.

These cases highlight the potential dangers of relying on AI for medical advice.


AI's recommendations often lack scientific backing
Expert concerns


Dr. Darren Lebl, research service chief of spine surgery for the Hospital for Special Surgery in New York, has previously expressed concerns over AI-generated medical advice.

He said that many of these programs' recommendations aren't real scientific recommendations backed by actual publications.

"About a quarter of them were made up," he revealed, highlighting the potential inaccuracies and risks associated with trusting AI for medical guidance.
