I feel motivated; therefore, I am (motivated). Motivation is a feeling. Might all feelings/emotions be motivations?
Feelings and emotions are experienced consciously. On top of this, emotions also have non-conscious and bodily components. Some use the terms interchangeably. Personally, I don’t care much for the semantic difference, since nothing human is purely conscious or devoid of body.
Please be honest.
You feel angry? Then you are motivated to enact your aggression. You may want to hit the mother f*.
You feel afraid? Then you are motivated to run. You may want to hide rather than hit.
You feel both? Then you really feel stressed. You may want to choose between fight and flight, or some combination. Complex feelings lead to complex motivations.
You feel ashamed? Disgusted? Happy? Horny? There is always motivation inside, and mostly vice versa. Are there motivations without emotions? Are there emotions without motivations?
Philosophers used to think this was possible. That is, that people – especially the better kind, such as philosophers – can be, or even mainly are, rationally motivated.
You have to know what you want, don’t you?
Modern brain science shows a different picture. No brain regions are exclusively rational or emotional. As to the motivational aspect, the ‘motivational neurotransmitter’ dopamine, for instance, intermingles with both. On top of this, the motivational motor that I see in Pattern Recognition and Completion is everywhere in the brain/mind.
Nature is a tinkering crawler. It tries and finds (or not) and tries again, using one thing for another purpose wherever possible.
Did nature come up first with motivations or emotions? In non-talking animals, we can hardly distinguish between the two, so it appears they have evolved together and, even more, are basically the same thing.
In the evolution toward us, rationality came gradually on top of this, but always in synthesis — no neatly distinct layer as some engineering designer might do, but a messy outgrowth. Typically nature ― one shouldn’t expect anything else.
Relevance to A.I.
Nobody wants super-A.I. to be motivated to eradicate us as we have, in the course of our history, eradicated many species on earth basically for our own gain. This makes many wish for super-A.I. not to be motivated at all, or at least to be motivated only to serve us.
At the same time, ‘emotional A.I.’ gets much attention. This is the endeavor to capture standard human emotions within algorithms toward – you guessed right – monetization, for instance, in marketing. Working with emotions toward motivating people (to buy things) brings the motivational aspect close. Can we keep it apart from the computational internals, thus avoiding the danger of non-Compassionate A.I. running freely? Will we be able to prevent this by sheer control?
I don’t think so, and that is very dangerous indeed.
As an additional problem, standard emotions are not the real thing (although still much repeated in pop psychology, unfortunately * ). Real emotional life is much more complex, beautiful, and deeply interesting, as our stress example from above already hinted. This is extremely important for A.I., especially when used in relation to the human mind. Indeed, even without bad intentions, A.I. can be dangerous to human cognitions and emotions.
Will A.I. be humanized, or will humans be dehumanized?
We still have the choice.
* To whoever is not well-versed in this matter, I recommend the extensive work of Prof. Lisa Feldman Barrett. Her insights are fascinating, robustly proven, and embedded in a lot more science.