Compassionate A.I. in the Military

April 29, 2022 | Artificial Intelligence, Sociocultural Issues, War and Peace

There are obvious and less obvious reasons to rethink the military from the ground up. Compassionate A.I. may lead to this rethinking. It should be realized as soon as possible.

Soldiers are human beings like everyone else, even if they have deliberately chosen military service. Compassion applies to them just as much.

The obvious

Many soldiers return from the battlefield with PTS(D) symptoms that may linger for a long time. For instance, as a result of the Vietnam War, 60,000 returning soldiers were diagnosed with Post-Traumatic Stress.

It was later shown that the way these men were standardly treated resulted, on average, in a worsening of symptoms. This should come as no surprise.

If anything, this indicates the need for proper management before and after deployment. Before leaving to serve, many soldiers could benefit from preventive Compassionate coaching to attain what the last two letters of AURELIS stand for: Inner Strength. This helps them keep their balance in difficult situations instead of mentally breaking apart at some point. Compassionate A.I.-based coaching (see Lisa) can bring this to the many.

Compassion is, of course, not just about being friendly.

It reaches much further, eventually delving into the depths of the human being. One consequence is daring to be vulnerable, which is so important (and heroic) for every soldier.

In due time, killing without Compassion will be seen as utterly primitive, but that’s a different story.

Still obvious

Still obvious, to me at least: this accords with being Compassionate toward the enemy. Of course, one can immediately feel how challenging this is. Can one kill Compassionately?

Challenging, but yes, indeed. It may be the best way to kill, whether an animal for food or an enemy who is otherwise about to kill one’s friends. Compassion lies in ‘knowing’ that this specific killing is the best thing to do in view of the circumstances. A crucial element is the genuine respect one feels for the killed.

This mental work needs to be done beforehand. It will then help the soldier in action to view the situation from a more substantial vantage point. It also relieves the moral doubt that may otherwise lead to PTS or moral injury.

Hating the enemy is not needed to do a soldier’s job. Besides, being able to kill without hatred is more efficient: it relieves the inner hesitation at the decisive moment. If this seems counterintuitive to you, dear army general, then think again. Research shows that inner hesitation plays a significant role in battle. If you want efficient soldiers, let them be Compassionate. Compassionate A.I.-based coaching can help. At the same time, it also supports the following points.

The less obvious

The goal of the military is to defend its own people.

The goals of some people inside and around the military can range from slightly to quite corrupt, unfortunately. Still, let us start from defense and proceed from there.

A Compassionate stance frees us to look wholeheartedly at peace as well. I’m convinced that almost all people on the planet prefer peace, including you. Not peace at any cost, but peace nevertheless.

Least obvious, perhaps

Non-Compassionate A.I. poses an existential threat to everyone, including one’s own people. The only way to let one’s people survive lies in going for Compassionate A.I. However one twists and turns, this remains entirely clear.

If ‘defense of one’s own people’ is truly the aim, there is only one way to go.

That’s it, you see?

That’s it.

No more war

Eventually, looking at it comprehensively, no war is worth it.

One thing is certain: either humanity leaves the stage to another organic species to survive intelligently (and maybe read this text someday?), or we move toward an era in which no war will ever be waged again. That is the Compassionate culmination point. Compassionate A.I. may help bring it closer.

Meanwhile, every Compassionate soldier can push this evolution forward through his frame of mind alone. Additionally, back home, his war experience may embolden him to keep fighting for the good cause throughout his life.

This does not lie in pacifism but in a future in which war is no longer needed. Isn’t this congruent with the military’s final aim?

Besides, Compassion also holds technological keys, but that’s a different story.
