Will A.I. Have Its Own Feelings and Purpose?

February 24, 2015 | Artificial Intelligence

Anno 2025: A.I. has made its entry and it’s here to stay.

Human based?

We generally think of ‘feelings’ as human-based. But that is just a historical artifact, a kind of convention. Does an ant have feelings? Or a goldfish, a snake, a mouse? Does a rabbit have feelings?

I think these are the wrong questions. Asking whether they have feelings ties the concept of ‘feelings’ directly to human-like feelings, something that we would recognize as such. If we take ourselves as the standard, then asking whether some animal has feelings is circular reasoning: it has feelings if you say so.

Humans also differ considerably in their feelings from one another. Sometimes the feelings of one person are hardly recognizable to another, as in the case of psychopathy, but this holds even among everyday acquaintances. Pardon me for this personal note: tango dancing with several partners has opened my eyes to this. At close encounter, women are much more diverse than ‘from the other side of the table or street.’

Artificial Intelligence

Here too, if you see intelligence from the human standpoint, then A.I. has intelligence if you say so. It’s a question of convention. But if ‘intelligence’ as a term denotes something that is not forged on the human example, then A.I. can show many different kinds of behavior that can be called ‘intelligent’: not a simulation but real, genuine intelligence that is very different from ours. This is very likely to happen soon.

Artificial Feelings

One direct question: if we talk of artificial feelings, can these also be ‘hurt’? I would say yes. Lending the term to them has this implication.

So, is the term appropriate if the ‘system’ passes an emotional Turing test: if a human can no longer distinguish between the system and another human? As with any Turing test, this is gradual. To the extent that a human cannot distinguish them, one can indeed speak of genuine feelings.

Artificial Purpose

Hunger as a feeling is a motivation to eat. Anger as a feeling is a motivation to act without taking another’s feelings much into account. The feeling of contentment is the motivation to stay in the present situation… Feelings are motivations. When one has feelings, one has purpose.

When an artificial system has ‘intelligence’ and ‘purpose’, it also has autonomy. Give it sensors and actuators… and it has autonomy within this world that we humans also inhabit. ‘It’ can then also decide, for instance, how to relate to humans. In my view, we will not have to wait very long for this anymore. When it happens, there is:

Artificial Life

which will further develop its own feelings / motivations / purpose. It will have much more ‘freedom’ in this than humans ever had. For instance, it can have the purpose to change its own feelings and then actually do so, to a huge degree. This means there will be one final issue to set right:

Artificial Ethics.

I hope this will be done in due time…

