AI as Your New Therapist: The Rise of Digital Emotional Support

There's no denying that we're firmly in a digital age. While there are no flying cars just yet, we're inching closer to that reality, especially given the development of AI over the past decade. What started as a tool to help you with shopping lists and presentation-building at your 9-5 has now morphed, rather scarily, into an aid for deepfakes and scam calls. Oh, and it's terrible for the environment! Yet one of AI's more intriguing recent developments is the way people are turning to tools like ChatGPT and Gemini as they would a therapist.

In a recent YouGov survey of 1,500 UK respondents, just one-fifth said they would log onto an AI platform to discuss their feelings; among the US respondents, one-third felt comfortable using AI in this capacity. When you think about it, we already confide so much in the technology in our lives; whether it's creating personal, anecdotal TikTok content or beefing with trolls in public comment sections, it's not hugely surprising that Gen Z would feel so comfortable opening up to a screen in this way.

It wasn't until a friend of mine casually mentioned her new use of ChatGPT during a friendly catch-up that this concept entered my awareness. A bot therapist felt like a step too far. How could a robot possibly help you through an emotional crisis? Would it feed you saccharine positivity quotes pulled straight from Google and rehash your input into a more digestible output? None of it made sense to me, but her passion for her new confidant convinced me to listen.

This friend (26, F) was going through a tough time and needed support during the festive season, when friends were busy shopping, celebrating and out of town. So, instead of pulling them away from their gatherings, she asked AI for some spur-of-the-moment advice. When asked why she didn't just book a last-minute therapy appointment, she said, "Sometimes an issue doesn't feel big enough to consult a therapist, but you would still like to discuss it. The fact that it's not a person makes it slightly less daunting; it feels more like talking to yourself. It's like going to the screen to buy your McDonald's meal instead of going up to the counter and having to talk to a real person instead." This makes sense, given the growing inaccessibility of therapy: long wait times for appointments in the public system and rising costs in the private one.

For her, using the tool was like using a digital-reactive diary: "There are things you would write in your journal that you wouldn't say to your friends. ChatGPT felt a bit like that—like a nonjudgmental place, where it's okay to open up." 

So, what's the harm in using AI tools this way? Psychotherapist Paul Fitzsimons says his primary concerns with turning to bots for emotional support are around safeguarding and risk. He says, "I can certainly see how beneficial and accessible AI therapy might be for someone experiencing mild to moderate anxiety or depression, with a willingness to do the work themselves in terms of growth and change." On the downside, there's a danger of relying too much on the tool, of bias, and of people not grasping the severity of their issues, such as complex trauma. There are also concerns around confidentiality that go beyond the topic at hand and affect the use of AI more widely.

However, Fitzsimons suggests that using AI to assist therapists (human ones, that is) could help combat the accessibility crisis for patients. This may look like "a screening call in a service that might recognize the person's presenting issues aren't too severe and that they could benefit from such a platform in the same way they might benefit from the likes of Silver Cloud or a good CBT app."

When you break it down, this shift has two key components: a means issue and, perhaps even greater, a vulnerability concern. Strategist Abbey Gaunt has noticed the shift away from confiding in those around us, as there are now more options than ever to lean on AI instead of human connection. She says this step away from opening up to those closest to us is a cocktail of "fear of judgment" and "not living up to the Sex and the City-inspired, picture-perfect friendships we see over matcha and Pilates on our feeds." As we aspire to uphold a pristine image in this social-media-forward age, we've found an alternative route to the connection essential to us as a species, without the risk of judgment. "AI doesn't judge. While therapist offices are often filled with white, middle-class privilege, anyone can access AI and dig into their struggles with nothing more than a free click, human friendships intact," she goes on to say.

The moral of the story is that it's all about balance. While we as a society have already shifted toward living much of our lives digitally, the need for genuine human connection will never diminish. Leaning on these new tools in fleeting moments of heartbreak, when you need some low-stakes advice or an interactive journaling session, is not the worst thing, despite the dystopian element of it all.
