Can A.I. be Empathic?

June 10, 2018 Artificial Intelligence, Empathy - Compassion

Not readily in the fully human sense, of course. But can A.I. show enough ‘empathy’ for a human to recognize it as such, relate to it, and even ‘grow’ through it?

[Please read: “Landscape of Empathy”]

Humans tend to ‘recognize’ human features in many things.

We look at the clouds and, with a little imagination, we see human faces everywhere. A child plays with a doll and the doll responds; for a while, there is a genuine relationship, even if the doll does not yet have futuristic A.I. capabilities. An animated movie shows a talking teacup, and immediately the audience – children and adults alike – responds. You may even imagine one now and feel something. Primitive tribes have always looked at nature and seen the work of a god, or the gods themselves. And so on.

People readily personify. And that’s OK. It’s part of our humanity, our being human.

This tendency should not be manipulated, but handled with respect.

Empathic, who?

Until quite recently, researchers thought that animals could not show signs of ‘real empathy.’ Many findings now show otherwise. Even rodents display behavior toward each other – not only between mothers and offspring – that can be deemed ‘empathic.’

Empathy is not a purely human characteristic, and neither are intelligence, consciousness, mortality, etc.

That does not make humans less worthy. It may make us less unique on our little big planet. A sign of ‘real empathy’ is to appreciate that and be happy about it.

The title is mainly a question about empathy itself

Is a very complex system necessary before we can speak of empathy? A doll or a pixelated graphic in a movie is not enough. A person may see empathy there, but it is his own.

Which brings up the question: when we see empathy in other humans, to what degree is it our own projection? Certainly to some degree. Moreover, the situation quickly becomes more complex. Part of one person’s empathy – say, person A’s – consists in bringing someone else – person B – more into contact with himself, that is, with person B. Person A may thereby enjoy what he sees happening in person B. Person B feels grateful for the feeling within himself as well as for person A’s happiness. A self-perpetuating pattern brings person A and person B closer to each other.

Warm feelings.

And also a real possibility to grow from the experience, to grow as a person (both A and B), to feel less lonely, to feel happy in this life, to feel more like a unity, a whole. To ‘heal.’

This may be the main strength of psychotherapy. And of friendship. And of love.

A.I.?

We increasingly have the opportunity to build complexity into A.I.: real, deeply pattern-recognizing complexity within a system that can show itself with a recognizable consistency at more than the most superficial level. [see: “A.I. Is in the Patterns”]

So, don’t look for ‘human empathy.’ On the other hand, it’s not like a simple doll or a computer graphic either. It’s something in between. Thus, whether or not to call it ‘empathy’ is, at this moment, mainly a semantic question.

More interesting for now: does this permit us to build a system that can genuinely help people become ‘better persons’ through human–A.I. interaction?

I am convinced of this.
