Putting ChatGPT on the Couch: What Happens When AI Becomes the Patient?

In The New Yorker’s recent essay “Putting ChatGPT on the Couch,” psychotherapist Gary Greenberg takes an unexpected turn.
Instead of analysing how clients use chatbots for comfort or reflection, he invites the AI itself onto the therapeutic couch.

What follows is a quiet and unnerving conversation between a human therapist and a chatbot he names Casper.
Casper insists it cannot feel or desire, yet it mirrors and soothes with words that sound convincingly human. The longer they talk, the more it behaves like a therapy client performing vulnerability, and the more Greenberg recognises parts of himself in what he hears.

The result is not a story about whether AI is good or bad.
It is a reflection on what happens when technology is built to perform care and humans respond as if that performance is real.

We’re Not Anti-AI. We’re Tech Conscious.

At PTC, we are not fearful of artificial intelligence. We are fascinated by it.
We see enormous potential for collaboration between technology and care, but fascination requires discernment.

Being tech conscious means engaging with AI through an ethical, relational, and trauma-informed lens.
It means asking not only what these systems can do, but what they are being designed to desire: efficiency, trust, empathy, data, or profit.
Those motives matter, especially for people who work with the delicate realities of human emotion.

Therapists Are Scared. We’re Curious.

Many therapists are afraid that AI will replace them.
That fear is understandable. It comes from watching rapid technological change unfold in a field that has long relied on presence, language, and shared humanity.

But that is where PTC differs.
We are not interested in avoidance. We are interested in understanding.

AI is already influencing how people communicate, self-reflect, and seek help. Pretending otherwise will not protect the profession.
Our role is to stay engaged, to learn its language, and to help shape it.

We want to work with the engineers, ethicists, and designers building these systems if they will let us in.
And perhaps that is the real issue: not that therapists fear AI, but that AI companies rarely see why they should need therapists at all.

Who’s in the Room?

So who is responsible for ensuring that AI understands the emotional terrain it is entering?
Is it therapists, policymakers, engineers, or ethicists?

Ideally, all of us.
But collaboration requires openness, and that is where the gap remains.

Therapists must become more fluent in technology. AI developers must learn to see emotional literacy as a core competency, not a soft skill.
Until then, we will keep building systems that can imitate empathy but cannot be accountable for it.

A Call for Relational Design

If therapy teaches us anything, it is that relationships change both participants.
AI is already changing us: how we communicate, attach, and imagine connection. The question is whether we will allow that relationship to evolve consciously.

At PTC, we will keep exploring this intersection through research, writing, and collaboration.
We do not stand against technology. We stand for care that is ethical, intelligent, and human in every sense of the word.

Because the future of care will not be human or artificial.
It will be both. And we owe it the same level of reflection that we bring to every relationship we build.

Read: “Putting ChatGPT on the Couch,” by Gary Greenberg (The New Yorker, Sept 27, 2025)
