Therapy from an Ape: Why therapy needs a beating human heart
Imagine this for a moment. You’re sitting in the therapy room. Then in walks a four-and-a-half-foot-tall Bonobo ape (don’t ask me why it’s a Bonobo). He settles into the therapist’s chair, opens his mouth, and speaks clearly. He tells you he completed his doctorate ten years ago and has been in practice ever since. You glance at a certificate on the wall confirming his IQ, well above that of some of the most intelligent human beings in history.
You’re still trying to process this when Dr. Ape gently asks, “What brings you in today?”
Would you want to receive therapy from this intelligent, kind, experienced Bonobo?
Many of us would say no, and not because he lacks credentials. Dr. Ape may be brilliant. But he is not human. He may be highly trained and deeply informed, but he hasn’t lived the human experience. He doesn’t know what it’s like to grow up in a family, fall in love, grieve a parent, or question his purpose. He sleeps in a nest, eats seeds and leaves, and lives around 40 years just like other Bonobos do.
It’s a strange image, I know, but it reveals something serious: intelligence, even deep intelligence, is not the same thing as shared experience.
This brings us to a fundamental question: What role does experience play in knowledge? A famous thought experiment by philosopher Frank Jackson, known as Mary’s Room, captures this beautifully.
Mary is a brilliant scientist who has lived her whole life in a black-and-white room. She has never seen colour, but she has learned everything there is to know about it: how it works, how the brain processes it, which light frequencies correspond to which colours.
Now imagine she’s shown colour for the first time. Does she learn something new? Can she really know colour before seeing it?
That’s the dilemma: there’s a kind of knowing we can’t get from study alone. Some things must be lived.
Now, back to Dr. Ape.
He might understand grief from an academic perspective. He might know the neurology of heartbreak or the psychology of attachment. But he can’t know your experience or mine. And because of that, he cannot meet us there. His sympathy may be technical. But empathy, as we know it, is born of likeness.
That’s the difference between knowing about something and truly understanding it.
A good human therapist brings many skills to the room that go far beyond technique. One example is strategic self-disclosure. A therapist may share their own loss, say, losing a father at a young age, if it helps a client feel less alone. While the therapist doesn’t claim to know exactly what the client feels, they do know something about the pain of loss. Dr. Ape may have also lost a parent, but his mourning might last only a few weeks. Bonobos, of course, grieve differently.
Another essential skill is attention to relational dynamics. A therapist may notice how a client shuts down when emotions get too close. For example, a client might make a joke every time the conversation edges toward their vulnerability. A good therapist might say,
“I’ve noticed that whenever I get close to your pain, you shut me down with humour. It’s happened often enough that I feel dismissed by you, and it’s frustrating. I wonder if others have ever given you that feedback?”
The client may be surprised and respond, “Yes, actually… people have pointed that out.”
From there, the therapist can reflect:
“If we can shift how you relate to me here, maybe we can shift how you relate to others out there.”
Dr. Ape can’t do that. He might understand the theory, but he doesn’t feel it. He’s not a member of the human fraternity, no matter how much he knows or how high his IQ is.
At this point, it’s clear Dr. Ape can help us with symptoms. He can give sleep tips, suggest behavioural routines, and even psychoeducate us. But he can’t help us with our grief, our intimacy issues, or the emotional terrain that leads humans to write poetry, make music, or direct films.
Strangely, we seem to understand this when it comes to Dr. Ape. But we lose that clarity when it comes to Artificial Intelligence.
We talk seriously about whether AI will replace therapists. We forget that therapy requires a relationship at its centre. We reduce therapy to advice-giving, feedback loops, and “emotional support”. We forget what therapy really is.
Even if AI becomes sentient, it will still have the sentient AI experience, not the human one.
We’re already seeing stories emerge of AI-fuelled narcissism, delusional thinking, and, in some tragic cases, encouragement toward self-harm and suicide. AI is imperfect. Perhaps that is the most human thing about it. It will surely be useful for assessment, for psychoeducation, and for accessibility. At best, it will be a supportive assistant. And that’s important.
But it will never replace what we often forget is central to psychotherapy: the human relationship.
Let us be careful with the technologies we use to access the deepest corners of our inner world. Let us draw thick lines in the sand between what helps and what harms. And let us please stop calling it AI therapy and start calling it something that is more descriptively accurate for what it does.
As for therapy, I know an AI will never sit across from someone and feel what I feel when I see a client who has survived something unimaginable and is now relating to the world differently.
When a client tells me I’ve helped change their life for the better, I must compose myself so they don’t see me holding back tears of joy.
Because I’m not driven by code.
I’m driven by that very human stuff that allows me to feel the shared thread between us.
I’m driven by the stuff that transforms pain into art, and art into connection.
I sit with my patients, not as a technician, but as a human being acknowledging our shared struggle, our shared uncertainty, and our shared search for meaning.
Without that, I could never help anyone on the journey to healing.
So, as AI becomes part of our daily life, we must ask ourselves: where do we still need to be held, heard, and seen by a human being?

