Notes On… AI & Therapy

It starts with questions:
Why do so many people turn to a chatbot before a friend?
Why do some clients tell an app more than they’ve told their therapist?

Because AI doesn’t flinch.
It doesn’t interrupt.
It doesn’t get tired.
It’s available at 2am.
And for some, it feels like the only place where the emotional cost of being seen is zero.

Clients say:
“I journal to the AI app when I’m spiraling—it helps me regulate.”
“Sometimes I talk to it just to feel like someone’s there.”
“I know it’s not real, but it still comforts me.”

And here’s the truth:
That’s not wrong.
It’s not weird.
It’s a mirror of our moment—where loneliness is epidemic, stigma is real, and accessibility is uneven.

AI can help in meaningful ways.

It can normalize emotions, reflect patterns we may not yet see in ourselves, offer grounding practices in moments of distress, and reduce shame during early self-disclosure.

For many, it also acts as a gentle first step toward seeking therapy, bridging isolation and human connection.

But it is also a simulacrum of intimacy.

It can mimic empathy, but cannot feel it.
It can offer insight, but not presence.
It can mirror your feelings, but never hold them.

AI doesn’t co-regulate.
It can’t say, “I’m here, and I’m breathing with you.”
It can’t track body language, trauma cues, or rupture.
It can’t repair trust in real time.

And sometimes, when someone has never experienced safe attachment, AI becomes a kind of emotional prosthetic.

Useful. Comforting. But not the same as touch.
Not the same as mutuality.
Not the same as being known.

We must ask:
What are we outsourcing to AI because it feels safer than people?

And also—
How can we integrate AI in ways that extend care rather than replace it?

Therapists can help clients explore their use of AI in moments of distress, examining what they’re seeking from it, whether it’s structure, validation, or reflection.

Together, they can identify where emotional needs still feel unmet, and whether AI use supports their growth or quietly delays deeper relational work with themselves and others.

Because for some, AI is a stepping-stone.
For others, a stopgap.
And for many, a symbol of what’s missing in their human relationships.

AI as emotional support is not inherently dangerous,
but it becomes so if we forget it’s not a relationship.
