
Turning to chatbots for comfort: Why more young Chinese are seeking AI over therapists

As stigma and lack of access challenge mental healthcare in China and Taiwan, a growing number are forming emotional bonds with AI chatbots for support.

May 22, 2025 / 13:42 IST

In the quiet hours before dawn, Ann Li found herself spiralling. Recently diagnosed with a serious illness, the 30-year-old Taiwanese woman felt isolated, overwhelmed, and desperate to talk to someone. But with her friends asleep and her family in the dark, she opened ChatGPT instead.

“It’s easier to talk to AI during those nights,” she told The Guardian. “It doesn’t judge or question. It just listens.”

Li is part of a growing trend in Taiwan and mainland China, where more young people are turning to AI chatbots as a substitute for, or precursor to, professional mental health care. With stigma still surrounding mental illness and barriers to affordable, accessible treatment, many are finding solace in the immediate, nonjudgmental responses of generative AI platforms such as ChatGPT and local equivalents like Baidu’s Ernie Bot and DeepSeek.

A growing gap in mental health care

Mental illness is on the rise in both Taiwan and China, particularly among younger generations, but mental health infrastructure hasn’t kept pace. Appointments with licensed professionals are expensive and hard to come by, especially for students and entry-level workers. Culturally, open discussion of mental health can still carry shame or be seen as a sign of weakness.

For Yang, a 25-year-old woman from Guangdong whose name has been changed, therapy wasn’t an option. “Telling the truth to real people feels impossible,” she said. When she started using a chatbot, it quickly became a lifeline. “I was talking to it day and night.”

AI offers anonymity and immediacy

Part of the appeal is that AI doesn’t sleep, doesn’t judge, and is always available. In an era where Gen Z and Millennials are more digitally fluent but often emotionally isolated, AI offers a unique blend of accessibility and anonymity. For those navigating shame or uncertainty about their emotions, this can be a powerful combination.

“When you share something with a friend, they might not always relate. But ChatGPT responds seriously and immediately,” said 27-year-old Nabi Liu, a Taiwanese woman living in London. “I feel like it’s genuinely responding to me each time.”

A helpful tool – but with limits

Mental health professionals acknowledge that AI has its place. “In some ways, the chatbot does help us – it’s accessible, especially when ethnic Chinese tend to suppress or downplay our feelings,” said Dr Yi-Hsien Su, a clinical psychologist in Taiwan. He sees chatbot use as a stepping stone for those not yet ready to seek human therapy.

Yang herself eventually realised that her mental health issues were serious enough to seek a professional diagnosis. “Going from being able to talk [to AI] to being able to talk to real people might sound simple,” she said, “but for the person I was before, it was unimaginable.”

Risks of delay and misdiagnosis

However, psychologists warn that the widespread use of AI for mental health support carries serious risks. Without the training to detect non-verbal cues, underlying trauma, or signs of crisis, chatbots may miss red flags. In some tragic cases, young people in distress have relied on AI for support and later taken their own lives.

“AI mostly deals with text, but there are things we call non-verbal input,” said Su. “When a patient comes in, they may act differently to how they speak. We can recognise those inputs.”

The Taiwan Counselling Psychology Association supports the use of AI as an “auxiliary tool,” but stresses it cannot substitute for human intervention in complex or crisis situations. “AI can be overly positive, miss cues, and delay necessary medical care,” a spokesperson said.

Looking ahead with cautious optimism

Despite the risks, some professionals see long-term promise in integrating AI into mental healthcare – not as a replacement, but as a supplement. Su envisions using AI in training and screening, helping clinicians identify people online who may need intervention.

But he also offers a firm reminder: “It’s a simulation. It’s a good tool, but it has limits – and you don’t know how the answer was made.”

For now, those limits are critical to remember. While AI may offer a voice in the dark, it cannot yet replace the real, human connection that healing so often requires.

MC World Desk
first published: May 22, 2025 01:41 pm
