Blindly trusting AI can cost you dearly – Obnews
Samira Vishwas July 05, 2025 06:24 PM

Artificial Intelligence (AI) has made our lives easier, but trusting it blindly can sometimes backfire. Recently, OpenAI CEO Sam Altman himself acknowledged that AI tools like ChatGPT cannot be 100% correct every time.

A recent incident involving a 14-year-old schoolboy in Mumbai has made it clear that caution is necessary before consulting AI tools on medical matters.

Abdominal pain, but the cause turned out to be something else
When the child complained of severe abdominal pain, his parents first asked ChatGPT about the symptoms. ChatGPT suggested that it might be a gastric infection. The worried parents immediately took the child to Apollo Hospital in Mumbai.

However, when the doctor began examining him, the matter turned out to be something else entirely.
The doctor said, “The child was avoiding eye contact and was hesitant to answer questions. When spoken to gently, he revealed that older children at school make fun of him, and out of this fear he does not want to go to school.”

In fact, the child had no physical illness at all; he had suffered an anxiety attack, an emotional problem rather than a stomach one.

Doctor’s warning: AI does not understand, it only reads data
The doctor said that nowadays people seek advice from AI because there is no judgment, they get immediate answers, and they can ask questions openly. But it should not be forgotten that AI cannot see us, cannot read our facial expressions, and cannot gauge our mental state.

AI tools such as ChatGPT make suggestions based only on the data they are given as input.

“AI can be a support system, but it is not a substitute for a doctor. Especially not in matters of mental health.”

🧠 Why are people turning to AI?
Instant answers

No judgment

No hassle of visiting a doctor

Answers to countless questions in one click

But doctors say that this very convenience is now becoming a warning sign.


© Copyright @2025 LIDEA. All Rights Reserved.