
Man hospitalised after following ChatGPT diet chart that led to rare poisoning

A man developed rare, life-threatening bromide poisoning after following ChatGPT diet advice, in what doctors say could be the first AI-linked case of its kind.

August 10, 2025 / 10:23 IST

In a bizarre and alarming incident, a man developed a life-threatening condition after following dietary advice given by ChatGPT. Doctors say the case could be the first-ever instance of AI-linked bromide poisoning, according to a report by Gizmodo.

The case, reported by doctors at the University of Washington in Annals of Internal Medicine: Clinical Cases, details how the man consumed sodium bromide for three months, believing it was a safe substitute for chloride in his diet. This advice reportedly came from ChatGPT, which failed to warn him about the dangers.

Bromide compounds were once used in medicines for anxiety and insomnia but were phased out decades ago after being linked to serious health problems. Today, bromide is mainly found in veterinary drugs and certain industrial products, and cases of bromide poisoning, also called bromism, are extremely rare.

The man first showed up at an emergency room convinced his neighbor was poisoning him. While some of his vitals looked normal, he was paranoid, refused water despite being thirsty, and experienced hallucinations. His condition quickly worsened into a full psychotic episode, prompting doctors to place him under an involuntary psychiatric hold.

After being given intravenous fluids and antipsychotic medication, his symptoms began to improve. When he was stable enough to talk, he revealed the source of his illness: ChatGPT. Concerned about too much table salt in his diet, he had asked the AI for alternatives to chloride. The chatbot allegedly suggested bromide as a safe swap — advice he followed without realizing the danger.

The doctors didn’t have his original chat logs but later asked ChatGPT the same question. The AI mentioned bromide as a possible replacement but without clarifying that it was unsafe for human consumption. Experts say this shows how AI can give decontextualized information without understanding the risks.

The man fully recovered after three weeks in the hospital and was stable at a follow-up visit. Doctors warn that while AI tools can make scientific knowledge more accessible, they cannot replace professional medical advice, and, as this case shows, they can sometimes give dangerously wrong guidance.
