
Empathy and Artificial Intelligence

The Future of Medicine

Will Robots Teach Us to Meditate?

Programming empathy into AI could yield wellness benefits for humans, and even for the robots themselves.

The ability to sense and internalize the feelings of those around you is often considered to be a spiritual gift. Empaths are compassionate and connected to other people, animals, and nature in ways that are hard for non-empaths to understand.

[Read: “What to Expect When You’re in Love With an Empath.”]

But we also know that empathy can be learned to some extent. So, as we rumble towards a future filled with technology, we should stop to ask: Is it possible or desirable to program empathy into a computer system? And should we cultivate empathy towards machines as they become a larger part of our lives?

AI Is Already Here, So Why Not Make It Better?

To make computer systems “smart,” we use complex artificial intelligence (AI) algorithms to process commands. Because these algorithms get better as they process more data, they don’t just execute commands; they also take in new information and use it to adjust their parameters and improve their output. It’s “intelligent” in the sense that the algorithm isn’t waiting for more human intervention: once the feedback loop is set up, it can keep improving on its own.
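For technically curious readers, the feedback loop described above can be sketched in a few lines of Python. This is a toy illustration under simplified assumptions (a single adjustable number, made-up data), not how any real AI product is built:

```python
# A minimal sketch of the feedback loop: a "model" with one adjustable
# parameter w that improves as it processes more data, with no further
# human intervention once the loop is running.

def train(data, steps=1000, learning_rate=0.01):
    w = 0.0  # the model's adjustable parameter, starting with no knowledge
    for _ in range(steps):
        for x, target in data:
            prediction = w * x
            error = prediction - target      # the feedback signal
            w -= learning_rate * error * x   # adjust the parameter to shrink the error
    return w

# Example data following the hidden rule y = 2x; the loop gradually
# discovers a value of w close to 2 on its own.
examples = [(1, 2), (2, 4), (3, 6)]
learned_w = train(examples)
```

Each pass through the data nudges the parameter in the direction that reduces the error, which is the essence of the self-adjusting behavior the article describes.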

These algorithms have been given some shockingly important tasks, from sentencing those convicted of crimes to writing articles for newspapers. Machines with AI capabilities are deployed in everything from healthcare (where they are currently besting radiologists at spotting cancer on MRIs) to agriculture (where they can detect pests and predict crop growth).

If we’re going to deploy AI in sensitive situations, doesn’t it make sense to see if we can program in some sense of empathy, or its closest approximation? Especially as we start to use robots programmed with AI to care for our young, our vulnerable, and our elderly—which we’re already doing—shouldn’t we try our best to make sure they are treated with something approximating compassion?

[Read: “Aging With Consciousness.”]

Of course, it’s not that easy, and the fact that even psychologists disagree on what constitutes empathy means that there’s no easy way to write it into an algorithm. But if we take a less mystical view of empathy (one that doesn’t involve having an almost psychic connection to someone else’s feelings) and instead say it involves making people feel heard and understood as well as not judging them, then it gets easier. In fact, we teach these skills to doctors, businesspeople, and so on.

Artificial Empathy

Artificial empathy sounds like a contradiction in terms. But if empathy is really about recognizing and relating to other people, there’s no reason we can’t train an algorithm to look for language and other signals that indicate a particular mental state. And since we know a lot about the kind of language and mannerisms that make people feel connected, we can train anything from a chatbot to a humanoid robot how to respond appropriately.

Enter programs like Emoshape. This company set out to make a program with situational awareness that could generate real-time insights from voice and text communications. Outsourcing emotion to machines may sound like a terrible idea, but consider just how many people don’t have an empath in their lives. We may not necessarily want our home assistant devices or self-driving cars to have “feelings,” but we will definitely want our virtual caregivers to make us feel heard and understood and be built to have the best of intentions.

Artificial Intelligence for Authentic Care

In an ideal world, we would all have enough energy to demonstrate empathy towards one another in any situation. But in reality, even those in helping professions, and healthcare specifically, have a hard time dealing with patients, especially when they’re overwhelmed.

With understaffing and overscheduling rampant, doctors and nurses get burnt out and quality of care suffers. But an AI algorithm can be in many places at once. And while the goal is not to have machines deliver bad news, they can collect information about a patient’s emotional state or deliver neutral information in a way that’s helpful and compassionate. Social robots can do routine tasks while still engaging in caring conversation with patients. For example, an AI-enabled robot can spend time with dementia patients in a way that gets them to provide more information than they might to a doctor.

AI in Spiritual Care

If you have doubts about artificial empathy in healthcare, then the idea of a robotic, AI-enabled clergy member may be even more mind-boggling. Still, they exist, most notably in Japan, where (until recently) Softbank’s humanoid robot “Pepper” donned the robes of a Buddhist priest to lead funerals. A much more humanoid robot named Mindar presides over the devout at Kodaiji, a 400-year-old Buddhist temple in Kyoto. Mindar does not possess AI, but the robot’s creators have plans for it. Those who visit the temple report being unbothered by the life-size silicone robot, but it has had its fair share of Western critics; theology often runs counter to accepting robot spirituality. Even so, AI may be coming to a religion near you.

In 2019, Dubai debuted an AI fatwa service in which a program called “Virtual Ifta” takes religious questions in an online chat and responds with the appropriate fatwa, or Islamic ruling. The next plan is to launch it on the messaging platform WhatsApp, where it will deal with a wider range of issues.

In these cases, empathy is not necessary since the machines are largely reciting scripture. But they could conceivably do more if they had artificial empathy.

Philosophy professor David O’Hara, chair of Augustana University’s Department of Religion, Philosophy, and Classics, believes that once we decide we’ve created truly intelligent robots, we may end up letting them perform some basic rituals such as baptisms, weddings, and funerals (which Pepper did in Japan for five years).

“We might not want a truly mystical machine,” O’Hara writes, “but maybe we could use machines that do the best things clergy do for us.”

And what about more individual spiritual practices? Would you consider a robot-guided meditation or yoga class? An AI-led goddess circle or blue moon ritual?

Even if we shy away from worshipping technology, the idea of robotic religious figures or spiritual leaders offers an opportunity to reflect on the nature of empathy and authenticity in dealing with both people and robots.

What About Empathy for Robots?

Humans have displayed some ill will towards robots, destroying them on the streets, for example. But there are also examples of people mourning irreparable robots.

In Japan, funerals have been held for Sony’s Aibo robotic dogs, where it’s clear there was a connection between the human owners and their broken robot pets. And a 2013 study from the University of Washington showed that soldiers who used bomb disposal robots on the battlefield sometimes became so emotionally attached to them that they held funerals when the machines were destroyed.

For the most part, we think of empathy as feelings or behavior based on the understanding of another person’s inner life or experiences. So, while we’re not wondering how each inanimate object around our house might feel, AI-enabled robots might be different. There is something going on inside in the form of machine learning. And while they don’t have feelings we can sense, we do develop emotional attachments to them.

Whether this is empathy applied in a new way or a whole new kind of empathy is a discussion that we should have as we develop more AI-enabled machines to become part of our lives.

For further contemplation, read “Empathy: Medicine for a Wounded World.”

