Can I use ChatGPT as a therapist?
It was an overcast summer day in Northern California at the river with a group of acquaintances when I found myself at the center of a conversation about using ChatGPT as a therapist. As the lone therapist in the group — mostly lawyers — I had the lonely task of defending why a human therapist is more useful than a chatbot. To my surprise, almost everyone admitted that at one time or another they had used ChatGPT as a substitute for a human therapist.
Based on my observations and conversations on this topic, I want to point out one critical difference that I don't think most people consider. Many people seem to operate under an unconscious assumption that someone else has the answer. "I want to speak with someone objective, a neutral party" is something I hear a lot. It's true that talking to a therapist is different from talking to a friend or a partner; still, underneath that wish lies an assumption that a perspective exists out there that is better than one's own. That's partly true, but it's incomplete. Therapists do possess knowledge, authority, and power by virtue of their training and experience, and some of that authority is granted by the patient. In fact, the therapist must be granted this authority; otherwise a patient wouldn't seek therapy in the first place.
What I notice with people using ChatGPT as a therapist is that this power, authority, knowledge, and perceived objectivity are almost completely handed over to the chatbot. The video linked here discusses the dangers of this, but I want to point out another subtlety that's less talked about. Using ChatGPT or any chatbot as a therapist is ultimately an exercise in self-objectification. The chatbot will give more or less the same answer to any given question, no matter who asks it. It might alter its response slightly based on previous conversations or details you provide, but it knows very little about how you actually operate or what makes you unique. It doesn't know how your mind is organized or why you think the way you do. If you describe a situation and ask for advice, it will take your words at face value and give a concrete answer. It won't account for why you are asking that question; perhaps there are other questions embedded in the original one. You and someone else asking the exact same question are likely operating from completely different unconscious beliefs and assumptions. Don't you deserve a different answer, one that is just for you?
The assumption of the other’s “objectivity,” whether therapist or chatbot, not only surrenders one’s own authority but obscures one’s unique subjectivity as the person seeking help. In other words, by projecting objectivity onto the other, the person is objectifying themselves.
Perhaps in the near future chatbots will be better than well-trained therapists at picking up subtleties in tone, intonation, word choice, and body language, and will have access to the vast library of research and literature that currently sits behind paywalls accessible mainly to professionals. Until then, and even after, I recommend a human therapist. And no, I'm not objective.