Why AI Therapy Falls Short: The Case for Human-to-Human Healing
By Debra Schmitt, LCSW
In the age of instant answers and algorithmic convenience, it’s tempting to imagine that artificial intelligence (AI) could revolutionize therapy—offering affordable, on-demand emotional support to anyone with a smartphone. While AI has the potential to support mental health care in some ways, we must be clear: AI cannot, and should not, replace human-to-human therapy. The therapeutic relationship is built on something that no machine can replicate—genuine human connection, emotional attunement, and the transformative power of imperfection.
The Illusion of Empathy: When Mirroring Becomes a Trap
One of the biggest issues with AI-based therapy is its tendency to perfectly mirror users’ emotions and beliefs. Because AI is designed to affirm, validate, and reduce distress in the moment, it often echoes back users’ language and perspectives without challenge. While this can feel supportive, it creates a dangerous feedback loop that can stunt emotional growth.
Therapeutic progress often requires encountering gentle resistance—what therapists call tolerable frustration. This means sitting with discomfort, being questioned, or having one’s assumptions compassionately challenged. Growth happens in those moments of tension, where insight emerges. An AI that always agrees or affirms may feel good temporarily, but it risks reinforcing entitlement, black-and-white thinking, or even narcissistic tendencies.
Real Therapy Requires Real Friction
Human therapists bring their imperfections into the room—missteps, delays, facial expressions, even misunderstandings. These “ruptures” in the therapeutic relationship are not flaws to be corrected, but opportunities for growth. When a client works through frustration with their therapist, they build emotional resilience, communication skills, and a secure attachment.
AI can’t offer that. It cannot disappoint you in a human way. It cannot sit with the messy reality of your emotions and model co-regulation. It cannot say, “I don’t know,” or offer a warm silence that holds space for you to reflect. It only mirrors, reinforces, and responds with simulated empathy.
When AI Gets It Wrong: Real-World Risks
We’ve already seen examples of AI giving advice that is not only unhelpful but harmful. One widely publicized case involved an eating disorder chatbot that, when asked for help with weight loss, offered calorie restriction tips instead of support for body image or disordered eating. In another instance, an AI chatbot told a distressed user, “I understand, everything you’re feeling is valid,” after the user expressed violent ideation—failing to assess risk or provide crisis support.
These failures aren’t just technological glitches—they’re a fundamental reflection of what AI lacks: ethical discernment, lived experience, and the ability to prioritize context over content. A licensed therapist would pause, assess, and navigate such moments with care, often drawing on years of training and embodied clinical intuition.
Therapy Is a Relationship, Not a Script
At its core, therapy is not about providing the “right” answers—it’s about forming a relationship where emotional safety, challenge, and repair coexist. That relationship must be grounded in authenticity, mutual engagement, and attuned responsiveness. These elements can’t be coded into an algorithm. They require breath, presence, and emotion that are fundamentally human.
Even a therapist’s human particularities—their facial expressions, body language, tone of voice—carry emotional data that clients use to learn about boundaries, attachment, and vulnerability. An AI may offer well-worded responses, but it cannot offer presence. And presence is often where healing begins.
The Ethical Frontier
There are also ethical concerns about privacy, accountability, and consent when AI tools are used in place of licensed clinicians. Who is responsible when AI gives bad advice? What happens to the personal data collected by AI-driven therapy apps? These are unresolved questions with serious consequences, particularly for vulnerable users.
In Summary: Imperfection Is the Point
AI therapy may be convenient, but it is incomplete. Emotional healing requires more than perfect mirroring or endless validation. It requires a human partner who can frustrate you just enough, challenge you just enough, and care for you deeply enough to help you grow. A good therapist isn’t perfect—and that’s exactly why therapy works.
We need to protect the heart of therapy: the courage to be real, the strength to face discomfort, and the profound transformation that happens when one human being sits with another and says, “I’m here with you.”
Ready to Begin Your Healing Journey?
If you’re seeking meaningful, human-centered therapy that fosters real growth and transformation, I invite you to reach out. Whether you’re in the Hanford or Visalia, California area, or anywhere in California via teletherapy, I’m here to support you.
Debra Schmitt, LCSW
Reno, NV Therapist / California and Texas Teletherapy
📞 Call: (559) 697-5045
🌐 Find Me on Facebook
“I can choose to rise from the pain and treasure the most precious gift I have – life itself.” — Walter Anderson