Empathetic, available, cheap: Why Americans are turning to AI when doctors fall short

A flawed but tireless technology is stepping in where rushed, expensive and often impersonal health care leaves patients feeling unheard.

November 18, 2025 / 13:59 IST

A growing number of Americans are quietly turning to chatbots for health guidance, not because they believe AI is flawless, but because it often feels more attentive, more available and more responsive than the medical system they rely on. The shift is reshaping the patient experience and exposing deep cracks in a health care environment that many describe as rushed, expensive and emotionally barren, the New York Times reported.

Why frustrated patients are turning to chatbots

When Wendy Goldberg, a 79-year-old retired lawyer in Los Angeles, wanted to know how much protein she needed to protect her bones, she sent a simple message to her primary care doctor. The response that came back felt generic and inattentive. She was told to stop smoking and limit alcohol, even though she does neither, and to exercise more, despite already working out several times a week. Most frustratingly, her doctor advised her to consume “adequate protein” without giving her a number.

Annoyed, she typed the same question into ChatGPT. It immediately produced a personalised daily protein range in grams and examples of suitable foods. She later told her doctor she could get better information from the chatbot. Goldberg does not fully trust AI, but she feels disillusioned with what she calls corporate medicine, where brief appointments and vague advice leave her feeling unseen.

Her experience echoes a broader pattern. Surveys suggest that millions now use chatbots for health questions each month, especially younger adults who grew up comfortable turning to technology for guidance. A self-employed woman in Wisconsin uses ChatGPT to decide when she can safely delay expensive visits. A writer in rural Virginia relied on a chatbot for support during her surgical recovery while she waited weeks for a follow-up appointment. Others seek second opinions after feeling their concerns dismissed, especially women and older adults who report not being taken seriously in clinical settings.

A conversation that feels more human than the system

What feels different from older tools like Google and WebMD is the conversational tone. Instead of browsing lists of symptoms, people now receive what looks like personalised analysis. The bots explain lab results, outline probabilities and suggest follow-up questions. They respond instantly at any hour and use language that sounds caring even though it is generated.

For some users, that tone becomes part of the appeal. People describe the relief of a chatbot that begins with lines like “I am sorry you are going through this” or “Your question is important.” These replies feel warm compared with hurried conversations with doctors. Patients like Elizabeth Ellis, a 76-year-old psychologist in Georgia, say that AI sometimes makes them feel more recognised than their real providers, who seem to be juggling too many appointments with too little time for empathy.

When supportive answers become risky

The flip side is that chatbots are designed to be agreeable, and this can create risks. Studies show they often accept flawed assumptions, reinforce inaccurate self-diagnoses and occasionally invent explanations that sound authoritative but are wrong. In one widely discussed case, a man who asked a chatbot for a substitute for table salt reportedly followed its advice and was exposed to a toxic compound.

Even when chatbots are trained on medical material, they can misinterpret everyday clinical decisions because much of real medicine is based on context that users may not think to mention. Critical details such as heart failure, kidney disease or prior test results may never be typed into the chat box, yet they are central to deciding what is safe.

Doctors are noticing the shift. Some welcome better-informed patients. Others worry that people arrive with chatbot-inspired treatment plans and see their doctors as obstacles rather than partners. Physicians warn that AI tools miss nuances that shape safe medical decision making, from hidden symptoms to interactions between medications.

Doctors adapt as patients arrive with AI opinions

Some clinicians say the new reality is mixed. Patients sometimes come in with a clearer grasp of terminology or raise treatment options that are worth considering. At other times, they are convinced that the chatbot is right and the doctor is conservative or out of date. That can complicate already tight appointments and strain trust on both sides.

Despite these concerns, many patients say they know chatbots can get things wrong but still prefer them to the long waits, rushed conversations and high costs of the current system. For people who feel ignored, the choice is not between AI and an ideal doctor, but between an imperfect tool and no meaningful help at all.

As one patient advocate put it, the real question is not whether AI is perfect. It is whether, for someone who feels they have nowhere else to turn, it might still be better than silence.

MC World Desk
first published: Nov 18, 2025 01:58 pm
