
Deep into the night, loneliness creeps in. The emptiness settles, and your body grows restless. You reach for your phone and message that one friend who’s always just a chat away: “I’m not feeling okay. Can you help me?”
The reply comes almost instantly: “I'm really sorry you're feeling this way. You're not alone, and I'm here to support you however I can.”
It’s not a person. It’s ChatGPT (or another AI chatbot), and it’s ready to help.
With its calm, reassuring tone, it asks thoughtful questions and offers suggestions — just enough to make you feel seen and heard.
In a world where mental health struggles are often dismissed as “all in your head,” therapy remains costly, and personal relationships feel increasingly difficult to maintain, ChatGPT and other artificial intelligence (AI) apps, available 24/7 and often free or low-cost, have become an alternative source of emotional support.
Several studies have found AI to be a promising tool for emotional support — sometimes even outperforming mental health professionals. In fact, some people are now deliberately choosing AI platforms when seeking help.
A study by Foyen et al. (2025) examined how licensed psychologists and psychotherapists rated the quality of advice given by AI compared to human experts.
The findings suggest that AI-generated psychological advice matches asynchronous written advice from human experts in scientific quality and cognitive empathy, and even surpasses the expert responses in emotional and motivational empathy.
Meanwhile, a Reddit user shared an anecdotal account claiming that ChatGPT outperformed his therapist in attentiveness. “I wrote a question going into detail about some very personal issues that I'm struggling with. ChatGPT responded to my whole question. It didn't just pick out one sentence and focus on that. I can't even get a human therapist to do that. In a very scary way, I feel HEARD by ChatGPT,” the user said. While individual experiences like this can be compelling, they do not necessarily reflect the broader effectiveness of AI for everyone.
On the other hand, critics raise concerns about data privacy. Since user conversations may be reviewed to improve AI systems, there is a risk that sensitive information could be exposed.
ChatGPT itself warns users not to share personal or confidential details, as chats may be used for training and system improvements.
Moreover, First Session, a Canadian mental health platform, stressed that while AI is instantly accessible and judgment-free, it lacks the depth and nuance of human connection. The complex emotions, experiences, and ideas that define human beings cannot be fully replicated by artificial intelligence.
The platform also noted that AI often provides only the illusion of support. Without genuine emotional processing, ongoing therapeutic work, and human connection, meaningful change is unlikely to occur.
Still, illusion or not, for many, AI therapy fills a gap — just enough to keep them going.
At the end of the day, AI is still just a tool, not a replacement for professional mental health care. While it can help organize your thoughts, offer perspective, and provide immediate comfort, it should not be used as a substitute for a qualified therapist, counselor, or other mental health professional who can provide the depth of care needed for long-term well-being. As artificial intelligence continues to evolve, it may come to work alongside mental health professionals, making the world a better place and our minds a healthier one.