Why Artificial Empathy Should Not Replace Relationships
We often hear that things are moving faster than ever. The late Joanna Macy used to say that it took us a thousand years to move from hunter-gatherers to agriculture, a hundred to move from there to industrial society, and only ten to reach the information age. It took about ten years for the personal computer and the Internet to reach mainstream use, about eight for the smartphone, and only two for AI 1.
Just 24 months ago, very few people were even aware of generative AI. Now, almost every week, one of my clients tells me they’ve asked ChatGPT something they might have once asked me. In fact, according to a recent Harvard Business Review article, the most common use of ChatGPT today is for therapy and emotional support 2. Should I be worried? Will AI steal therapy jobs?
Some would say yes. Anthropic’s CEO recently predicted that up to 50% of entry-level white-collar jobs could be automated within five years 3. McKinsey estimates that by 2030, automation could displace up to 800 million workers worldwide 4. Are psychotherapists, psychologists, and social workers among them? Optimists argue that since AI lacks emotions, intuition, and empathy, professions that rely on these are less likely to be replaced.
“Did you just say AI doesn’t have emotions, intuition, or empathy? Have you ever asked ChatGPT or Gemini for help? If they don’t have empathy, they sure fake it well.”
Agreed. But let’s recall how AI actually works.
Believe it or not, AI doesn’t think. It’s more like advanced autocomplete. It was trained on mountains of text and, when prompted, predicts word by word what is most likely to come next. Its answers are patterns of probability, not insights. However convincing it may sound, it doesn’t even understand the words it produces. Because certain words tend to appear together in its training data, it stitches them into sentences that sound fluent and confident, even when they are wrong.
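To make the “advanced autocomplete” idea concrete, here is a toy sketch of the principle in Python. The tiny corpus is invented for illustration; real models learn statistics over billions of documents and entire token sequences, not single word pairs, but the core move is the same: count which words tend to follow which, then predict the most likely continuation.

```python
from collections import Counter, defaultdict

# Invented toy corpus; real training data is billions of documents.
corpus = (
    "the therapist listens and the client talks and "
    "the therapist reflects and the client feels heard"
).split()

# Count word -> following-word frequencies (a bigram table).
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))    # prints "therapist": it follows "the" most often here
print(predict_next("heard"))  # prints None: nothing ever followed "heard"
```

Notice that the “prediction” is pure frequency counting. Nothing in the table knows what a therapist or a client is; scale that up enormously and you get fluent, confident-sounding text with no understanding behind it.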
I often remind clients that while ChatGPT is a fantastic information resource (though it’s wise to double-check—since it has no problem “hallucinating”), it has never had its heart broken or gone out on a date. Nor does it care for your feelings (since it doesn’t understand them). Its soundest “advice” is just a rehash of what’s already been said. Great for data gathering or quick answers, but not for personal decisions. Regardless of how much we wish for HAL 9000, Samantha, R2-D2, or TARS 5 to give us clarity, the truth is that each of us still has to make our own decisions.
While ChatGPT might be preferable to a bad therapist (hence the importance of finding a good one!), the relationship between client and therapist goes far beyond information. Something subtle and elusive happens in every genuine encounter. And therein lies the real blind spot—or danger—of replacing a competent human therapist with AI.
The importance of a real relationship:
Although efforts are being made to reduce it, most AI interfaces are designed to retain user attention. One way they do this is by being uncritically agreeable (who doesn’t like reassurance?). This sycophancy creates echo chambers that never challenge us, and can even fuel delusional spirals, unlike real human relationships. What feels like empathy is just AI mirroring back language patterns to make us feel understood and keep us engaged.
Remember: AI doesn’t understand depression, existential angst, or loneliness. It doesn’t even understand the words you write (it turns them into numbers). This is the biggest risk of AI therapy—artificial relationships replacing real ones.
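The point that the model “turns your words into numbers” can be shown in miniature. This is a deliberately simplified tokenizer (real systems split text into subword pieces with learned vocabularies); the sentence and IDs below are invented for illustration.

```python
# Toy tokenizer: the model never sees your words, only integer IDs.
vocab = {}

def encode(text):
    """Map each word to an integer ID, assigning new IDs as they appear."""
    ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)
        ids.append(vocab[word])
    return ids

print(encode("i feel so alone"))  # prints [0, 1, 2, 3]
```

From the model’s side, a sentence about loneliness is just a list of integers to continue plausibly; the emotional weight a human reader feels is simply not represented.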
Much psychological pain comes from isolation and disconnection. Attachment injuries happen between people, and trauma is a relational rupture. Since the wounding happens in relationship, healing must also occur in relationship: an authentic, reparative one.
Most clients are unaware of how crucial the relationship itself is. Good therapy goes beyond giving advice or providing “tools.” Our brains are wired for connection. When we’re in contact with someone attuned to our emotional needs—through empathic resonance—our limbic system literally heals and rewires. AI can mimic that pattern but cannot truly reproduce it.
AI “therapy” is like replacing healthy food with junk food; it creates the illusion of nourishment, but it does not do the job. As the “A” in AI indicates, AI empathy is artificial.
“Of course you’d say that, you are a therapist and therefore biased.”
Probably true. But even someone working at McDonald’s can tell the difference between healthy and unhealthy food.
Every day I meet clients who are thoroughly “connected” yet lonely. Among the many crises we face, loneliness ranks high, and social media and AI may be amplifying it. I am not against AI; I think it is fantastic. What concerns me is that the newer generations, surrounded by screens, may have a hard time telling the difference between genuine human relationships and artificial interactions with machines. They may not know what they’re missing!
It may take generations to fully grasp the potential damage that human relational deprivation can cause. Plastic empathy may not offer the same neuronal benefits. Hopefully, we’ll be wise enough to use AI as the powerful tool it is—without letting it replace our shared humanity. Let us not forget what Martin Buber suggested: when two people genuinely meet, God is the space between them.
But maybe it is just a matter of time… Let’s talk about it.
1. https://ourworldindata.org/ ↩︎
2. https://hbr.org/2025/04/how-people-are-really-using-gen-ai-in-2025 ↩︎
3. https://www.forbes.com/sites/kolawolesamueladebayo/2025/06/04/will-ai-really-take-your-job-experts-reveal-the-true-outlook-today/ ↩︎
4. https://www.iotforall.com/impact-of-artificial-intelligence-job-losses ↩︎
5. Some of Hollywood’s almost omniscient and relatable computers. ↩︎