
Doctors at AIIMS New Delhi have issued a strong warning against using AI tools like ChatGPT for medical diagnosis or treatment, after a patient suffered severe internal bleeding from following advice generated by a chatbot. The warning came from Dr Uma Kumar, who heads the Rheumatology Department at the institute and spoke about the incident while interacting with the media.
According to doctors, the patient had been dealing with persistent back pain and decided to seek help online instead of visiting a clinic. Turning to ChatGPT, the patient asked for advice and the chatbot suggested commonly used painkillers. Trusting the response, the patient bought non-steroidal anti-inflammatory drugs from a pharmacy and started taking them without any medical consultation or tests. What followed was severe internal bleeding, a complication that could have been avoided with proper medical supervision.
The episode has set off alarm bells among doctors, especially at a time when artificial intelligence feels like an instant solution to almost everything. With answers delivered in seconds and written in confident, reassuring language, chatbots are increasingly being treated as virtual doctors. But medical professionals warn that this confidence can be dangerously misleading.
Dr Kumar explained that medicine is not about giving quick fixes. Doctors follow a careful process where they rule out different causes through physical examination, patient history and investigations before deciding on treatment. A chatbot, she pointed out, does not know the patient sitting on the other side of the screen. It cannot assess past illnesses, hidden risks or how a particular drug might react inside that specific body.
In this case, the painkiller advice may have sounded routine. After all, many people take similar medicines for back pain. But what the AI tool could not judge was whether the patient had a higher risk of stomach or internal bleeding. Without blood tests, scans or even a basic clinical check, the advice turned dangerous.
Doctors are also worried about what they call “AI hallucinations” — situations where chatbots provide answers that sound authoritative but may be incomplete or incorrect for a real person. While companies like OpenAI, which develops ChatGPT, clearly state that their tools are not meant to replace doctors, many users tend to overlook these disclaimers when they are in pain or panic.
The incident has now sparked a larger conversation about public awareness and regulation. AIIMS doctors are urging people to use the internet and AI tools only for general information, not as a substitute for medical care. They stress that even medicines available without a prescription can cause serious harm if taken without guidance.
As AI becomes more deeply woven into everyday life, doctors say the responsibility lies with both technology companies and users. Convenience should never come at the cost of safety. When it comes to health, a human doctor — with questions, tests and careful judgement — remains irreplaceable.