AI Therapy: A Useful Tool, Not a Therapist

· therapy, counseling, AI, artificial intelligence

Artificial intelligence is increasingly present in mental health spaces. Chatbots offer “supportive conversations.” Apps promise mood tracking, coping tools, even something resembling therapy. I recently learned from a student therapist that her school uses AI-simulated clients for practicum training.

Clients sometimes ask what I think about this.

I think there are real benefits. I also think there are real limits.

Both deserve to be named clearly.

What AI Can Offer

AI-based mental health tools can lower barriers to care. They are available 24/7. They don’t require insurance. They don’t place someone on a waitlist. For people who feel overwhelmed, isolated, or unsure where to start, this matters.

AI can also be a useful source of basic psychoeducation and simple self-help tools for common concerns like anxiety or depression:

  • Simple grounding or relaxation exercises.
  • Journaling prompts.
  • Mood or habit tracking.
  • Gentle reminders of coping strategies.

Some platforms have even helped normalize the idea that mental health support can happen outside a traditional therapy office. Normalization matters. Consider physical health: guidance is readily available, and there is no stigma attached to seeking it. If AI lowers the threshold for someone to begin paying attention to and regulating their inner world, that’s a meaningful contribution.

What AI Cannot Replace

AI is not real relationship.

AI does not notice subtle shifts in affect. It doesn’t sense hesitation, dissociation, shutdown, or avoidance. It can’t carry a felt understanding of you over time. It can mimic empathy, but it can’t empathize.

Therapy is not primarily an exchange of techniques, advice, or regurgitated data from previously recorded conversations.

Therapy is relationship.

It is built through attunement, consistency, repair, and trust. It unfolds slowly. It adapts moment to moment based on who you are and what you feel.

Risks Worth Noting

1. A False Sense of Being Known

AI can simulate empathic language, and good language can feel deeply validating. But hearing words of understanding is different from being understood. AI cannot truly relate to you. It certainly can’t feel. And no matter how advanced the technology becomes, it will never have real emotions or real empathy for you in the way that a human being attuned to your unique history, patterns, and nervous system might.

That distinction matters. A lot.

2. No Real Clinical Judgment

AI cannot reliably assess risk, safety, or complexity. It cannot truly evaluate suicidal ideation, psychosis, domestic violence, child abuse, or medical issues. These situations are not rare in therapy, and recognizing them and taking appropriate action is central to ethical practice.

3. No Accountability

Licensed therapists are bound by ethical codes, state regulations, and continuing education requirements. AI systems are not accountable in the same way. If harm occurs, there is no clinician responsible for repair.

4. Privacy and Data Concerns

Barring a few exceptions, such as risk of harm to yourself or others, therapists are bound by ethical codes to keep your words and your records confidential. Many AI platforms store, analyze, and share user data. Policies change. Ownership changes. Platforms are bought and sold. Most people do not fully understand how those changes affect the way their information may be used.

5. Substitution for Human Contact

For some, AI becomes a preferred alternative to reaching out to a real person. This can actually deepen isolation and loneliness. At worst, it is dangerous, for the reasons noted above.

A Balanced View of What AI Is and Isn't

AI is best understood as a mental health support tool, not therapy.

It can complement human care. But it cannot replace the relational core of therapy.

Much like a workbook, a meditation app, or a self-help book, AI may offer structure and reflection. The deeper work still happens in relationship. With yourself and with another human being.

I don’t view AI as the enemy of therapy (at least not yet, although that time may be coming). I view it as a sign that people want:

  • Lower cost
  • Less stigma
  • More flexibility
  • More autonomy
  • Immediate access

Those are reasonable desires. My hope is that the future does not force a choice between AI and therapists, but instead expands access to human care while using technology thoughtfully and ethically, without losing what makes therapy therapy.

If you use AI tools and find them helpful, that makes sense. If you’re struggling and relying only on AI, I want you to know that you deserve more support than that. Not because AI is bad, but because you are human and AI is not. Humans heal in relationship with other humans.