Telling Secrets To A Machine

I know people who are using ChatGPT as their personal therapist, and they always seem amazed by the quality of Chat’s answers. I’m guessing a great percentage of the questions posed to Chat go something like this: “This guy I like never called me back, and so I texted him, and he wrote a flat answer back with no emojis, and so I tried to call him like two days later, and he never picked up and never called back, and I don’t know what to do. Do you think he likes me?”

And Chat will answer with something like this: “No.”

Chat is definitely not the “Girl, give him another chance” guy.

Here we have one of the most intelligent computer programs ever to exist, and are people asking questions about quantum physics or ancient archaeology? No. When it comes down to it, most people want to know about relationships. In interviews and recent studies, teenagers say they’re increasingly treating AI less like a tool and more like a companion: something that listens, gives advice, and, in a way, keeps them company.

It’s not surprising. ChatGPT neatly sidesteps three of the biggest barriers to mental-health support that have existed forever: it costs nothing, it’s easy to access, and it’s always available. When therapy comes with long waiting lists, eye-watering fees, and layers of bureaucracy, it feels a little precious, and frankly a bit out of touch, to insist that no one should ever turn to AI instead.

There’s also early evidence that this isn’t just a placebo.

The first clinical trial using an AI program to support people with mental-health disorders reported real improvements. Patients working with a tool called “Therabot” showed meaningful reductions in depression and anxiety, along with fewer body-image concerns.

ChatGPT has arrived at exactly the right moment for a perfect storm. The cost of living is crushing, access to mental-health care is limited, and typing a prompt into an AI is free, immediate, and extremely patient. The reality is that therapy isn’t always accessible, even when it’s desperately needed. AI isn’t a replacement — but it has become a stand-in. That doesn’t make it good or bad. It makes it complicated.

The broad consensus in studies is cautious. AI shows promise, but it is nowhere near ready to function as a true mental-health intervention — if it ever should. The enthusiasm tends to outpace the evidence.

So the more interesting question may not be whether people should be using ChatGPT, but whether using it is moving us toward a better quality of life and healthier, real-world relationships — or quietly pulling us away from them. Do we really need another screen-based solution to manage our loneliness? Or do we want to live in a society that’s more connected, more present, and more supportive in real life?

A woman sits on her couch at midnight, typing into ChatGPT about a fight with her sister. She gets a calm, thoughtful response: measured, validating, clear. She feels better. Relieved. Understood. And then she closes her laptop and never calls her sister. The AI helped her process the feeling, but it also gave her an exit from the harder work of human connection. The question isn’t whether ChatGPT helped in that moment; it did. The question is whether it moved her closer to the relationship, or quietly replaced it.

There’s a balance to be struck. We should be thoughtful about relying on a tool that consumes enormous energy, borrows from human creativity, and risks becoming a substitute for personal growth rather than a support for it. In an ideal world, maybe we wouldn’t need ChatGPT at all.

But when the conversation starts to sound like blame, aimed at individuals for turning to AI, it’s worth being suspicious. Why the overreaction? That kind of finger-pointing has a way of diverting attention from much larger forces at play. It reminds me of how corporations popularized the idea of a “carbon footprint” to shift responsibility away from industrial greed and onto individual behavior.

Instead of scolding people for using AI, perhaps we should be angrier at the system that made it feel necessary — a system so expensive, fragmented, and isolating that people end up confiding in a language model because there are no better options.

That — not the technology — is the real problem.
