It took me three weeks to muster the courage to have one of the toughest conversations you can have with your therapist. Even tougher than confronting that repressed childhood memory. This was to tell her that things weren’t working out anymore, and that I wanted to switch therapists. In the hours before I finally did, I ate two Kit-Kat bars, drafted a rough script, and chewed my lip raw. When our session began, my leg bounced under the table for 50 whole minutes and my voice broke as soon as I spoke. “I feel like I’ve been pushing against a wall and making no progress,” I admitted between tears. I don’t know what kind of response I was expecting—she couldn’t exactly hold me hostage—but I was surprised by her thoughtfulness. She acknowledged that she had already noticed my apprehension in the last few sessions and said she completely understood and trusted my decision. Before we parted, she even commended me for initiating what was clearly a difficult conversation.
A difficult conversation that many others seeking therapy today will apparently not be having. In the last year, internet users across the world have turned to the AI programme ChatGPT for many purposes, be it drafting risky texts to the ex, coming up with the perfect excuse for ditching work, or phrasing that perfectly passive-aggressive email. An acquaintance recently confessed to me that he asks ChatGPT which colour to wear every morning. Most recently, ChatGPT has been increasingly playing another role: therapist.
“ChatGPT is my built-in therapist, I don’t burden anyone with my problems anymore,” reads a tweet with over 5,000 retweets. In Instagram’s comment sections, people recommend the chatbot to others as a cure-all. On the subreddit r/ChatGPT, a long-winded post details in bullet points how a year of therapy with the chatbot fixed all of one user’s issues.
“I recently got out of a situationship and just couldn’t let go, so I turned to the internet for answers,” admits 24-year-old Tamanna Shah, an account executive from Bengaluru. “No amount of Reddit or Quora anecdotes helped me find closure, so in a state of absolute helplessness I finally turned to ChatGPT and started aggressively venting, hoping to find some solace.” Shah was expecting robotic responses but found herself surprised by ChatGPT’s compassionate replies. “The best advice it gave me was to consider if I am reacting to my past or present when I feel triggered,” she shares.
Based in Mumbai, 22-year-old photographer Poorna Agarwal also recalls getting hooked on the software while trying to get over a situationship. “Maybe I haven’t found the right therapist, but therapy has always felt impersonal, and somehow a bot felt more empathetic,” she reveals, adding, “Since ChatGPT also uses Gen Z lingo like a millennial, it’s like talking to an advisor and big sister.” She remembers the most powerful advice the chatbot gave her: “Don’t be the person he leans on when he’s drunk if he isn’t willing to choose you when he’s sober.”
It’s this sense of empathy that Vauhini Vara, tech writer and author of The Immortal King Rao, has often spoken about after collaborating with ChatGPT on her short story ‘Ghosts’.
Both Shah and Agarwal accept that ChatGPT is likely not as helpful as a qualified therapist, but rather a Band-Aid fix to fall back on in desperate times. In fact, most users of the software that I spoke with confessed to being driven to the chatbot when they could see no other option.
Mumbai-based fashion design student Sanya Patney, 21, first used the software when she had a falling out with a friend, and her sister—her usual confidante—was unavailable. Graphic designer Ishanee Joshi, 26, caved when she returned to Mumbai from her university abroad and found herself stuck with her financially and emotionally abusive father, who wouldn’t let her seek therapy. New York-based designer Maira Malik, 31, couldn’t afford to pay for weekly sessions, so she found solace in an empathetic bot.
Despite a rise in sliding-scale options and NGOs that offer counselling, therapy remains unaffordable and therefore inaccessible to most people across the world. A 2023 study conducted by AIIMS stated that 80 per cent of Indians do not seek treatment for mental health issues due to financial barriers and stigma. With ChatGPT, everyone finally gets a therapist. However, this may come at an even higher cost, one that goes beyond AI’s concerning collection of personal data and its considerable toll on the environment.
“While ChatGPT does a good job of mimicking theories and trying to apply them, it gives you the same insights you would get from an Instagram infographic or TikTok,” explains psychotherapist Sanjana Nair. “You may find out the A to Z of what is wrong with you but what’s the use of that information if you can’t apply it?” She adds that rapport-building is one of the most important parts of therapy, which dictates how a client’s treatment moves forward. It is normal, even healthy, for a client to feel discomfort and a fear of judgment at the idea of sharing their deepest, darkest thoughts with someone who is essentially a stranger. “Once you work through these inhibitions, you can transfer the skills you’ve learnt in the therapy room to your real life,” Nair elaborates, “But when the human is taken out of this equation, a client never learns to compromise, fight or resolve and regulate their emotions because there’s no friction whatsoever. The chatbot is mechanically serving you, so you don’t learn to tolerate another person’s differences and end up becoming someone who genuinely doesn’t want to make an effort to form connections.”
NYU researcher Jodie Redmond, whose work explores the intersection of loneliness and technology, points out that there are some clinically approved mental health chatbots that therapists can direct clients to between sessions. Still, an over-reliance on any AI software can be outright dangerous, as seen in the cases of a US-based 14-year-old boy who died by suicide after falling in love with a chatbot, and a Belgian man who was encouraged by a chatbot to ‘sacrifice himself’ to help reduce climate change. “When a person brings up wanting to end their life, a chatbot is supposed to go into high-risk mode, a safeguard that makes it stop the conversation and provide helpful resources,” states Redmond. “In these cases, it didn’t do that.” Redmond’s work also involves testing these safeguards, and they have found that many chatbots are still scarily ill-equipped to deal with people in such vulnerable situations.
Even in cases that are not as extreme, AI chatbots can prove to be harmful, perpetuating a cycle of loneliness. “Someone who is already feeling lonely can now get a chatbot to be their friend, and it can spiral,” Redmond notes. Depending heavily on an AI software for intimacy can end up further hindering interpersonal relationships. As Nair points out, the lack of conflict with a chatbot can be easy to grow used to. “People start thinking, ‘Why should I do it for someone else if I don’t have to do it for AI?’ But care comes from friction.” Redmond agrees. “Communication can erode when done through technological means because technology can make it easy—but communication isn’t always supposed to be easy. Therapy isn’t supposed to be easy.”
In contrast to a chatbot, where one can easily change the subject or log off mid-conversation, therapy is a gruelling long-term process. A good therapist not only pushes clients to confront their issues but also holds them accountable. The fear that many Twitter users have of being a burden by simply being vulnerable is, in fact, proof that ChatGPT therapy is not working. People forget that vulnerability is at the core of human connection. Maybe there is something to be said in favour of eating two Kit-Kat bars, drafting a rough script and chewing your lip raw before having that, or any, tough conversation.