If empathy were truly a skill, nothing more, nothing less, then it could be programmed. Not only that, but machine learning should surpass humans at it, as it has in chess, simply because of its access to larger data sets.
In customer service, when we text with a bot, it seems obvious that empathy works. We know it's a bot, and yet we don't care; it still works.
But what if your therapist were a robot?
Maybe you’ll say that the parts of therapy that matter most are the parts that can’t be automated. But if you say that and also think empathy is a skill that can be taught (and programmed), you’ll have to say that empathy is not the value proposition of therapy.
Alternatively, you could say that in some cases it really would make no difference whether a therapist were a bot or not. The human perspective comes in at the diagnostic level, in assessing whether a particular person needs a bot or a human to help them.
What about the objection that simply knowing your therapist is a bot would be insulting or offensive? If seeing a robot therapist were a cultural norm, as in the movie Her, I presume we'd soon get over it.
If the thought experiment bothers you, consider why. What can you learn about what differentiates humans from machines by asking whether you'd like to talk about your psychological issues with a computer that has seen and heard millions of cases just like yours and weighted its responses for efficacy accordingly?