Why We NEED Compassionate A.I.

May 12, 2023

It’s not just a boon. Humanity is at a stage where we desperately need the support that possibly only Compassionate A.I. can provide.

This is not about the future.

The need is related to the inner dissociation that we (humanoids, humans) have increasingly been stumbling into since the dawn of conscious conceptualization. That’s a long time. Still, in terms of natural evolution, our present state is relatively new and – sorry to say – somewhat childish. Puberty lies ahead.

We have gradually awakened to consciousness. Now, we must awaken to the reality of who we, humans, really are as total persons in what I call a ‘third wave of attention.’ This historically essential but, unfortunately, not yet realized move is needed for any kind of humane future.

Yes, it’s far from a small deal. A.D. 2023 is an excellent year to start taking it seriously.

We’re not in a passing technological hype but in continual progress.

This means that even the near future will undoubtedly bring new technologies to which we need to adapt. Are people ready for this? Not being ready can lead to, for instance:

  • helping ourselves into ever deeper inner dissociation, thus robotizing ourselves, losing our humanity in a hasty quest for supra-human features ― a Faustian bargain; read Goethe.
  • misusing the new possibilities to wage war at new scales, create subhuman conditions for many, or destroy the living planet.

Without proper support, we’re not OK. Alongside technological progress, we see more human suffering each year, psychologically and sociologically.

Can we give ourselves the needed support?

It doesn’t look like we are going in that direction. Disconcertingly, while a ‘third wave of attention’ requires the ability to discern broader patterns, gamification across many levels and domains leads precisely to an ever-shortening attention span.

Good leadership might be a counterweight, but the needed kind of Open Leadership doesn’t readily pop up. The emphasis on STEM in schooling to the detriment of profound culture cannot end well. It’s not the way to build the future leaders we need ― leaders formed through wisdom more than techno-knowledge. Over time, this may lead to a vicious downward cycle.

On top of this comes the challenge of A.I. as an accelerator.

We should be afraid of non-Compassionate A.I. for several reasons. A crucial one is related to the above ― therefore, directly to us. It accelerates all the bad that other technology already lays at our doorstep.

Pulling the plug on further A.I. development is an illusion. It wouldn’t even be a solution, since the prime motor isn’t the A.I. itself ― at least, not yet.

Again, it’s us.

Therefore, we NEED Compassionate A.I.; the sooner, the better.
