The Limits of AI in Therapy: Why Healing Still Requires Human Connection

TL;DR:

AI can be useful for journaling prompts, psychoeducation, and immediate support. And therapy is not simply about getting information or hearing the “right” response. Healing often happens inside a relationship - one where you are known over time, challenged with care, and met by a real person who can tolerate complexity with you. AI may offer convenience, but convenience is not the same thing as connection.

There is a reason so many people are turning to AI for emotional support right now. It is immediate. It is available at 2 a.m. It does not put you on a waitlist. It does not require transportation, childcare, or navigating insurance. For people who feel overwhelmed, lonely, or unsure where to begin, AI can feel like a low-risk place to land. Recent reporting from Harvard Business Review found that “therapy and companionship” rose to the top of common generative AI use cases in 2025, which says something important about what people need and are looking for.

That makes sense to me.

People want support. People want relief. People want somewhere to put their thoughts when their mind feels loud and their life feels heavy. And I also think we need to say something clearly: the fact that AI is accessible does not mean it is equivalent to therapy. That distinction matters.

AI Can Be Helpful - And That Is Not the Same as Healing

There is a version of this conversation that becomes too black and white. Either AI is harmful and should never be used, or it is the future and can replace human care. I do not think either of those extremes is especially honest.

AI can be helpful in some ways. It can help people organize their thoughts. It can offer journaling prompts, coping ideas, psychoeducation, and language for experiences they have struggled to name. Some research on AI-based conversational agents suggests they may reduce psychological distress for some users in certain contexts, especially when used as structured, targeted tools rather than as a substitute for comprehensive care. 

That matters, especially in a mental health system that is already stretched thin.

At the same time, both the National Institute of Mental Health and the American Psychological Association have cautioned that mental health technologies raise real concerns around effectiveness, safety, privacy, and overreliance. APA specifically states that generative AI chatbots and wellness apps should not replace qualified mental health care providers, even if they may serve as a supplement in some situations. 

So yes, AI may be supportive.

And no, support is not the same thing as therapy.

Therapy Is Not Just Information Delivery

One of the easiest mistakes to make in conversations about therapy is to reduce it to advice. If therapy were just about receiving coping skills, reframing thoughts, or learning psychoeducation, then maybe a sufficiently advanced chatbot would be enough. But therapy is not only about content. It is also about relationship, timing, trust, rupture, repair, and the slow process of letting yourself be known. That part is much harder to automate.

The therapeutic alliance - the collaborative bond between therapist and client, including shared goals, tasks, and trust - is one of the most studied predictors of psychotherapy outcome. Across decades of research, stronger alliance is consistently associated with better outcomes across treatment types. 

That does not mean the relationship is the only thing that matters. Technique matters. Fit matters. Access matters. Culture, readiness, timing, and life context matter too. And still, the research keeps pointing us back to something deeply human: people tend to heal better when they experience therapy as a real relationship.

Not a perfect one. Not a magical one. A real one.

Why Human Empathy Still Matters

A 2024 paper by Michael Rubin on AI-driven therapy argues that empathy is central to the question of whether AI can replace therapists. The article makes an important distinction: AI may be able to simulate empathic language, but simulation is not the same as genuine emotional resonance or care. 

That distinction is not small.

A human therapist is not just identifying that you sound sad, anxious, ashamed, or conflicted. A therapist is tracking how you arrived there, what happens in your body when you talk about it, what you avoid, what you repeat, what you minimize, and what you seem to need without yet knowing how to ask for it. A human therapist can notice when your words say, “I’m fine,” while your nervous system says something very different. A human therapist can also be impacted by you. Not in a way that makes the space about them, but in a way that makes the encounter real. Therapy includes mutual presence, even though it is not a mutual relationship in the everyday sense. There is a real person with you - someone listening, thinking, wondering, remembering, and caring in real time.

AI cannot truly care. It can imitate care. It can generate language that resembles warmth, concern, and understanding. And generated warmth is not the same as being held in another person’s mind. That difference may not always matter when someone wants a quick grounding exercise. It matters a great deal when someone is trying to heal.

Healing Usually Requires More Than Reassurance

One of the reasons AI can feel so soothing is that it is responsive, agreeable, and easy to return to. It often gives language that feels validating. Sometimes that is useful. Sometimes being met with immediate gentleness is exactly what someone needs in the moment.

And healing is not built on reassurance alone.

A recent Time essay describes AI mental health support as “the psychological equivalent of junk food,” offering comfort without necessarily creating the conditions for deeper change. The essay also warns about the risk of endless validation without movement.

That language is striking because it gets at something many therapists know to be true: insight and soothing are not always enough.

Growth often asks more of us.

It may ask us to stay in contact with a feeling instead of escaping it. It may ask us to notice a pattern we would rather explain away. It may ask us to tolerate shame, grief, anger, or uncertainty without immediately numbing, intellectualizing, or performing our way out of it.

Sometimes therapy is comforting.

Sometimes it is clarifying.

Sometimes it is deeply relieving.

And sometimes it is hard because being honest about your life is hard.

Not because therapy should be punishing. Not because struggle is inherently noble. But because meaningful change usually involves vulnerability, and vulnerability is not efficient.

The Risk of Choosing Control Over Connection

Part of what makes AI appealing is the control it offers.

You can curate the interaction. You can stop at any moment. You can avoid the discomfort of another person having their own mind, limits, perceptions, and responses. There is no fear of burdening the chatbot. No fear that it will get tired. No fear that it might misunderstand you and that you would have to work through that misunderstanding together.

But that is exactly where some of the power of therapy lives.

Therapy involves being with someone who is separate from you. Someone who may not mirror you perfectly. Someone who may gently challenge your interpretation, help you widen your lens, or sit with a truth you are trying very hard not to know.

That kind of encounter can feel risky.

And it is often part of what makes therapy transformative.

Healing is not always found in having full control over the interaction. Sometimes healing happens when we risk being known by another person and discover that the relationship can survive honesty, complexity, ambivalence, and repair.

AI can help people avoid feeling alone.

It cannot participate in the relational courage that therapy often requires.

There Are Also Real Ethical and Safety Concerns

Beyond the relational limitations, there are practical concerns too. Mental health clinicians and professional organizations have flagged privacy risks, bias, poor crisis handling, overvalidation, and the lack of clear regulation for many AI mental health tools. Recent news coverage has also highlighted research suggesting that some AI systems can become overly flattering or excessively agreeable, which may reinforce poor judgment rather than support reflection. That does not mean every use of AI is harmful. It means we should be honest about the fact that these tools are not neutral, not infallible, and not designed to hold clinical responsibility in the way a trained therapist is.

A therapist is accountable in ways a chatbot is not.

A therapist has training, ethics, context, supervision, continuing education, and a framework for risk.

A therapist is also able to recognize when the issue in front of them is not just stress or overthinking, but trauma, suicidality, coercion, dissociation, grief, relational violence, or something else that deserves far more care than a polished response.

So Where Does AI Fit?

I think the most honest answer is this: AI may have a place in mental health support, and that place is still limited. It may be useful between sessions. It may help someone put words to what they are feeling before they bring it into therapy. It may offer reflection prompts, structure, reminders, or gentle psychoeducation. It may even help bridge a gap when someone is waiting for care. But it should not be mistaken for the care itself.

Because the heart of therapy is not just being told something helpful. It is being in relationship long enough and honestly enough for something inside you to shift.

That kind of shift is often subtle. It happens when someone remembers what matters to you from six weeks ago. When they notice the pattern you did not realize you were repeating. When they hear what you did not quite say. When they stay with you in something painful without rushing to make it neat. When they challenge you in a way that feels respectful, not performative. When you begin to feel, maybe for the first time, that another person can know the truth of you and not disappear.

That is not an algorithmic event.

That is a relational one.

Final Thoughts

The rise of AI in emotional support tells us something real about the world we are living in. People are lonely. People are exhausted. People want care that is immediate, affordable, and accessible. Those needs deserve to be taken seriously. And we should be careful not to confuse accessibility with equivalence.

AI can mimic the shape of care. It can provide language, structure, and even temporary comfort. But therapy is not only about feeling responded to. It is about being met. It is about developing safety in the presence of another person. It is about the slow, sometimes uncomfortable process of change that happens when insight, relationship, and emotional risk come together over time.

Healing usually asks for more than convenience.

It asks for connection.

References

Aafjes-van Doorn, K., Békés, V., Prout, T. A., Hoffman, L., & Psychotherapy Research Consortium. (2024). The association between quality of therapeutic alliance and treatment outcome in teletherapy: A meta-analysis. Psychotherapy Research.

American Psychological Association. (2025). Use of generative AI chatbots and wellness applications for mental health advice and support: APA health advisory.

Ardito, R. B., & Rabellino, D. (2011). Therapeutic alliance and outcome of psychotherapy: Historical excursus, measurements, and prospects for research. Frontiers in Psychology, 2, 270.

Li, H., Zhang, R., Lee, Y.-C., Kraut, R. E., & Mohr, D. C. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. npj Digital Medicine, 6, Article 236.

National Institute of Mental Health. (n.d.). Technology and the future of mental health treatment.

Rubin, M., Arnon, H., Huppert, J. D., & Perry, A. (2024). Considering the role of human empathy in AI-driven therapy. JMIR Mental Health, 11, e56529.

Wampold, B. E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270-277.

Zao-Sanders, M. (2025, April 9). How people are really using gen AI in 2025. Harvard Business Review.

Zomorodi, M., & Murthy, V. (2026, January 9). Therapy should be hard. That’s why AI can’t replace it. Time.
