Empathy cannot be codified

(but you can code with empathy)

Sure, you can say "Not yet!", and that's okay: I like to be challenged. But let's take a step back and break this down.

Empathy cannot be codified because humans need to share human feelings - not only when they're happy, but especially when they're not.

We live in a moment where we hear, read, and see A.I. everywhere. And what a wonderful world it is! I can have my virtual assistant build my shopping list while blasting my favorite '80s song through the speakers (and next time she'll probably remind me of some item I forgot from my regular list). We see Google Assistant demos calling a restaurant to book a table.

"Uhm, oh!" We hear the filler words mimicking a human voice, its hesitation and doubt…

Amazing! How far we’ve come!

But let's not forget that the data feeding these systems is significantly biased by gender, race, and age. The crunched data that fuels and maintains these machines is still fundamentally based on the standard of the white, forty-ish, heterosexual male. And the machine still struggles to understand the outliers, who happen to be the majority of the world's population (not white, not male, not forty-ish).

That's a structural data issue

It needs to be solved before we can have reliable data sourcing these artificial intelligence endeavours. And while that might not be very problematic when booking a table at a restaurant, it can cause serious problems in health- and diagnosis-related areas.

But how far are we from having a machine that can deal with the emotional load of a complaint? Or with a life-threatening emergency? Will we have a 911 virtual assistant telling us to press 9 if we're being stabbed so that it dispatches a patrol car to reach us? Or a soothing female voice calmly walking us through basic CPR instructions so that we can try to resuscitate someone?

This is when you say:

"Yeah, OK, those are edge-case situations and we shouldn't rely on a machine, at least not for now. I wouldn't feel comfortable doing it if my life or that of a loved one depended on it."

So let's take it down a notch and look at a less life-threatening scenario. Elisa, working as an agent on a platform dedicated to short-term rentals, had to deal with a dream vacation gone terribly wrong.
Mom stressed because the host cancelled the booking.
Kids throwing legitimate tantrums.
And it's raining, and the family is out in the street in an unfamiliar city.

What tools does a CX agent have?

The booking platform she worked with (to find the family a new place to stay), her expertise as an agent, and her understanding of the situation: the ability to put herself in that family's shoes and feel how desperate they must have been. And this is exactly why people look for other people when they're in despair (or in any other extreme situation) (1).

We long for the sense of understanding, of sharing with someone (not something) who can show sympathy. How would a machine "understand" and take into account the rain, the kids' breakdown, the family's unfamiliar surroundings, the broken plans and expectations?

How can a machine understand all the underlying variables that shape the situation?


Humans have it hormonally coded: it's oxytocin that enables social synchrony. Spengler et al., while studying the importance of oxytocin in social communication, advance the following hypothesis:

…social synchrony triggers the release of endogenous OXT, which in turn may improve bidirectional social emotion transmission by enhancing emotional expressiveness. This augmentation of affective sharing may subsequently contribute to better synchronized interactions, thus enabling the fine-tuning of social communication.

Naturally programmed for empathy

We humans are bio-social-cultural living mechanisms (or "things," if you will). Since day one we're programmed to understand the sighing, the silence, the teeth grinding, the frowning, the hand clenching. And when that "programming," for some reason, is not properly executed, we find ourselves lacking the basic recognition of the other: lacking empathy.

There are countless resources nowadays that try to teach us to be more empathetic. It seems this is something that can be learned and taught, and I won't argue with that. But the most serious cases of an innate lack of empathy, the ones that lead to a consistent, long-term process of trying to build an artificial empathy in someone, are the cases where the subject is labeled a sociopath. If machines need to learn empathy, what does that make them?

Even Minter Dial (an advocate for and believer in machine empathy) says in his book Heartificial Empathy:

The challenge will be finding the right mixture and chemistry for the agents to be assisted by machines in providing, in combination, a more empathic and effective service.

I want to stress a part of that sentence: “be assisted by.”

Science has already come a long way.

Emoshape's AI Rachel is a real-time appraisal computation. It allows the AI or robot to experience 64 trillion possible distinct emotional states every tenth of a second. Right, right…

The robot can experience 64 trillion distinct emotional states. It can replicate those states. But can it read them on my face?

Can it see how annoyed I am with the idea, held by some tech gurus, that one day we won't need people on the other side supporting us by doing what they do best? Relate.

Quote and illustration: CX agents will not be replaced because they have empathy.

Luckily, we have people and companies focused on finding the balance between technology and human expertise. People and companies that aim to facilitate agents' work by leveraging machines' capabilities. I feel great relief when I visit Cleverly's page and read:

AI-powered Human Augmentation is the key for agents to get the information they need at all times to handle each customer request. Agents feel more confident in the answers they provide, especially while working remotely.

Or when a newsletter from Zendesk pushes us to question preconceived notions by writing, "While automation has its benefits, customers also crave human interaction. So why not integrate the two for a seamless hybrid model?"

What I read between the lines is "let's help agents" instead of "let's replace agents."

It’s putting the agent’s role in the right spot: the spotlight.
Equipping them with proper tools.
Consciously building a balance.
Supporting agents instead of overwhelming them.
In that spotlight, the reality of the agent being the first line of contact is recognized and valued.

Realizing that there are companies that approach innovation this way gives us a little extra push to start the week. And it helps us hope that humanity isn't moving toward a dystopian reality of codeful interactions.

Notes, Sources & Resources:

  1. You can read similar life stories in Undefined World: on how people find, in support roles, paths they'd never imagined before.
  2. Emphasis on both Zendesk’s newsletter and Cleverly’s homepage was added by us.

About Tymeshift:

We're Tymeshift, an effortless WFM solution made exclusively for Zendesk to make workforce managers' lives easier. Our workforce management tools include shift scheduling, forecasting, and analytics. A perfectly intuitive Zendesk integration makes your CX agents' lives easier. Learn all there is to know about us and our product at tymeshift.com.