The Problem With The Therapy Bot
The human therapeutic relationship is more reliable than anything a machine could do for you.
There are only a few days left until this year’s long Easter weekend. Before we talk about bots in therapy, let’s take a moment to reflect on the Catholic confessional. Many of us remember the curtain, the shadows, and the voice on the other side, a voice that belongs to a human being. You don’t see the other person when you confess, and they won’t—in theory—really judge you. The entire setup creates a sense of anonymity, while at the same time preserving the fact that another human being is there, sitting with you, listening, and talking to you. This is truly important, because healing has never been about the exchange of information. It has always been about the unpredictable act of listening and human understanding, guided by knowledge, empathy, and emotions that lead to healing. The confessional screen did not eliminate the other person. It made being with them bearable.
The therapy bot promises to make human therapists obsolete. And in doing so, it solves the wrong problem. But let's be honest: by 2026, bots will be a reality in our daily lives, and they'll be very capable. If you look at the therapy field as a whole and try to book an appointment with a good therapist, you'll find that there aren't enough of them. The ones who are truly good are usually very expensive, waiting lists stretch for months, and in rural communities there are hardly any therapists at all.
Given the global mental health crisis, this is not a theoretical problem. It is real because, for example, there is someone living in Kentucky who cannot afford $150 for a therapy session, or a teenager who has never even heard the word "therapist," or a veteran who would rather talk on the phone at 2 a.m. than sit across from a stranger in an office that smells of lavender, with highly intellectual books lying on a designer desk in front of him. Bots, by contrast, are always there: they never get sick, never get tired, work around the clock, and don't charge hourly rates.
Is It Better to Talk to a Bot Than to Have No One to Talk to at All?
Many people argue that talking to a bot is better than nothing. And it's certainly fair to say that AI bots used in therapy are democratizing access to healthcare. The problem is that these people don't realize who they're actually talking to in a session conducted by a chatbot. A therapist I know once told me about a session that changed her patient's life. The patient had gone through his usual routine for a few minutes, and then the therapist did something an algorithm would never do. She simply said nothing for a very long time, until the silence became unbearable and the patient began to cry. He said that for the first time in his life, he felt like a real human being who could be vulnerable while sitting across from another person who refused to fill the silence with a barrage of helpful answers.
The bot would have responded right then and there, because a bot is designed to react and optimized for exactly that. Its choice of words is never an accident; the algorithm is programmed to mirror our language. If you say you're sad, it will acknowledge your sadness. If you say you're anxious, it will suggest a technique to combat the anxiety. It is never wrong, like a calculator that never miscalculates.
But therapy isn't a matter of calculation. The therapeutic relationship is one of the most reliable predictors of profound personal change, more reliable than anything a machine could do for you.
There is the moment the therapist misunderstands you, the moment they get angry at you, the moment you want to leave and never book another session with them. A human relationship can weather that conflict; a bot never can. The real risk of therapy with a bot isn't that it won't help. It's that it will help just enough, conveying a sense of progress sufficient to make the hard work of sitting across from another human being seem unnecessary.
What is Everyday Life Like Without Friction?
Consider what happens in your daily life when friction is removed. You order food through an app and lose the conversation with the person at the counter. You stream music chosen by an algorithm and lose the act of browsing, the thrill of finding a new record in a record store. You write a text instead of calling. You call instead of visiting. Each removal is small, and each is celebrated as "very efficient" in our AI-driven world. The cumulative effect is that life runs smoothly but feels emptier.
The therapy bot extends this logic to the most intimate domain: your inner life. It offers you a relationship without the cost. You do not need to be in crisis to feel this; it is already happening in smaller ways.
• You've had a hard day. Instead of calling a friend, which requires timing, conversation, and the risk that they're also in a bad mood, you open a chatbot, and it always says the right things. You close the app, and the friend never knows you were struggling.
• If your child is nervous before a presentation at school, instead of going through this uncomfortable situation with them—which can be tedious and sometimes chaotic, and might lead you to say something wrong— you give them a device with a mindfulness bot that helps them do breathing exercises. Of course, these techniques help once the child has calmed down, but at the same time, something is lost: the memory of a parent who, while not having all the answers, still stayed by their child’s side.
• You're a manager, and an employee comes to you in distress. The company has just rolled out a new "Wellness AI," so you suggest the employee try it, and you have removed the burden from yourself. But you have also removed the possibility of imperfect listening, as a human being rather than a trained professional machine.
In each case, the bot does not fail; it succeeds at that moment and continues succeeding thereafter. It teaches us that the efficient path to feeling better is the one that does not require another person.
So the main question is this: do we believe that healing requires another person?
If the answer is that you do not need another person and you believe that the right information, delivered at the right time and in the right tone, is enough, then the bot is not just acceptable; it is superior. It is the therapist who never burns out and never checks the clock.
But if the answer is yes, if you believe that something happens between two people in a room that cannot be reduced to language patterns and therapy protocols, then we must be careful about what we are doing here. Not because a bot is evil, but because it is just good enough: it provides something that feels like therapy, pitched to each person's situation in the moment. And at the same time, it will erode human practice in a slow, comfortable way, a replacement nobody notices at first, until the original therapeutic appointment with a human being starts to seem like the wrong choice.
The bot has no wounds, and a thing without wounds cannot say to you with any authority, "I understand you."
So the question stands: what kind of people are we becoming if we would rather be understood by something that has never suffered?
Jens Koester is a strategic advisor focused on the structural friction between exponential technology and the enduring patterns of human culture. Through The Human Datum, he provides the intellectual architecture and foresight necessary for leaders to navigate the AI-driven decade with clarity and intentionality.