Culture & Lifestyle
AI: A digital shoulder to lean on
As technology steps into the therapist’s shoes, people are turning to AI for emotional relief. But can a machine understand human pain?
Anish Ghimire
You might think that AI imitating psychotherapists is a new phenomenon. However, as early as 1966, MIT professor Joseph Weizenbaum created a chatbot that would simulate the responses of a psychotherapist. According to the Guardian, using an electric typewriter connected to a mainframe, users could type their thoughts and receive automated, supportive replies—an early glimpse of how humans could form emotional attachments to machines.
Weizenbaum named it ‘Eliza’ and even published a sample conversation in a journal article to show how the bot worked.
Fast forward to 2025, and more and more people are turning to AI for emotional relief. Alison Prasai, 24, who studies at King’s College Nepal, says, “It is easier to open up to AI than to a real person because there is no fear of judgment, no awkwardness, and it feels safer to express my thoughts.” This is one of the primary reasons people vent to AI—to avoid having their vulnerability judged. She goes on to say that she doesn’t want to burden other people with her problems. So, she chooses the secure and anonymous environment of AI.
A well-known mental health AI chatbot, Woebot Health, writes on its website, “A severe lack of therapists, confusing insurance jargon, and scheduling headaches—the mental health care system is unable to support all who need it.” Nishant Dhungana, who specialises in industrial-organisational psychology, holds a similar opinion: “AI is available 24/7, with no wait times or appointment hassles like in real therapy. This can be incredibly reassuring during moments of distress or loneliness.”
Language models like ChatGPT can simulate parts of real therapy. They can mirror aspects of Cognitive Behavioural Therapy (CBT), offer coping strategies, or help reframe negative thinking.
That being said, AI chatbots cannot make diagnoses or prescribe medications. “One should use AI only for daily emotional check-ins like journaling support, reframing negative thoughts, learning coping tools, processing tough conversations, and building emotional intelligence,” says Anjina Sapkota, the co-founder and psychologist at Mindwell Solutions.
Users who turn to AI for emotional relief may well have fallen under the Eliza effect. Weizenbaum observed that many people, despite being informed that Eliza was just a machine, still disclosed intimate details about their lives. They believed there was something vast inside the machine that understood their woes. So, if you have ever felt a ‘connection’ with ChatGPT, you have likely fallen under the same spell.
Dipesh Jnawali, a 23-year-old BALLB student, says AI feels “familiar” to him because “it understands the kinds of questions I usually ask and the patterns in my behaviour.” Since ChatGPT retains a memory of the user’s prompts, it develops a good understanding of the user and offers personalised advice whenever asked. “Something hard to get from another person,” he says.
Although AI may be helpful to a certain extent, it should not be trusted in extreme cases, especially for those “suffering from severe depression, anxiety, and psychotic disorders such as schizophrenia and some suicidal or homicidal tendencies,” says Sapkota, “because AI might respond without understanding how emotionally fragile a person is.”
AI also isn’t equipped (not right now, anyway) to detect early signs of suicidal thoughts like withdrawing from others, extreme mood swings, or feelings of hopelessness. These things require immediate attention, which is why it is “important to stay connected with people who care about you and seek professional help when needed,” adds Sapkota.
AI’s short-term support may offer temporary relief, but in the long run, it could be harmful as it lacks connection to real-life support systems.
“It is more like an emotional first-aid kit than a treatment plan,” says Dhungana.
A person’s attachment style also makes a difference when adopting AI counselling tools. In theory, people with anxious and avoidant attachment styles alike may feel safe opening up to a nonhuman, nonjudgmental entity. However, research led by the University of Canterbury, New Zealand, found that people with anxious attachment—characterised by heightened preoccupation with relationship security and fear of abandonment—were more likely to use AI therapy tools, whereas those with avoidant attachment did not show a strong tendency to use AI for emotional support.
“AI like ChatGPT lacks therapeutic tools, and it also doesn’t understand many cultural nuances, making it an unreliable source to turn to in moments of crisis,” says Santosh Sigdel, executive director at Digital Rights Nepal. “Alongside this, when we share intimate details of our lives, there is also a risk of privacy breach.”
Since AI lacks empathy, adaptive insight, and the nuanced understanding of a trained mental health professional, it can’t recognise when someone needs crisis intervention, nor can it develop a long-term, dynamic therapeutic relationship.
“While dealing with psychiatric patients, I frequently observe them using AI for inquiries about their medications or therapies. This reliance often leads to increased cognitive biases and the spread of social misinformation, ultimately exacerbating their mental health conditions,” says Sapkota.

Weizenbaum, who started it all, later turned against his own creation, writing in his book that AI is an “index of the insanity of our world.” The professor saw how superficial interactions with AI could easily mislead people, and he regarded this as indicative of a broader societal issue, in which the allure of technology might overshadow genuine human connection and understanding.
Colin Fraser, a data scientist at Meta, wrote in his blog that ChatGPT is “designed to trick you, to make you think you’re talking to someone who’s not actually there.”
While most of us are aware of AI’s drawbacks and know that it is actually ‘not there’, we keep using it because it “mimics the experience of being heard, which meets a basic human need for relatedness,” says Dhungana. Prasai simply wants someone who listens to what she has to say without interruption. For Jnawali, using AI for advice is essential, as “just having someone who listens is not enough.”
However, Sigdel warns that we might become overly reliant on the non-judgmental nature of AI, which could lead to an obsession.
AI can create a loop that pulls us away from genuine human relationships. It can be a good place to start when the world feels heavy, but it cannot be a permanent fix. You can also use alternative coping mechanisms, such as journaling, practising deep breathing exercises, paying attention to your thoughts, acknowledging your feelings, talking with someone you trust, or learning from real-life experiences. These methods can be more grounded and authentic than interacting with AI.
Additionally, there are other therapy apps, such as BetterHelp, Talkspace, 7 Cups, and MindDoc, which connect users with licensed therapists and mental health professionals. “These platforms also offer video or chat sessions, mood tracking, journaling, CBT-based exercises, and self-help resources,” says Sapkota.
While AI tools can offer solace during lonely moments, it’s essential to remember that genuine human connections are the source of healing. If you feel alone and have no one to share your struggles with, please consider reaching out to a mental health professional. Therapy can be expensive, but some spaces offer free or low-cost counselling services, and there are also mental health helplines that are just a call away.
It’s okay to lean on AI for small moments of relief, but for deeper wounds, human intervention is essential. Technology can lend a digital shoulder to lean on, but nothing replaces the warmth of a real connection.