Why We NEED Compassionate A.I.

May 12, 2023

It’s not just a boon. Humanity is at a stage where we desperately need the support that possibly only Compassionate A.I. can provide.

This is not about the future.

The need is related to the inner dissociation that we (humanoids, humans) have increasingly been stumbling into since the dawn of conscious conceptualization. That’s a long time. Still, in terms of natural evolution, our present state is relatively new and – sorry to say – somewhat childish. Puberty lies ahead.

We have gradually awakened to consciousness. Now, we must awaken to the reality of who we, humans, really are as total persons in what I call a ‘third wave of attention.’ This historically essential but, unfortunately, not yet realized move is needed for any kind of humane future.

Yes, it’s far from a small deal. A.D. 2023 is an excellent year to start taking it seriously.

We’re not in a technological hype but in continual progress.

This means that even the near future will undoubtedly bring new technologies to which we need to adapt. Are people ready for this? Not being ready can lead to, for instance:

  • driving ourselves into ever more inner dissociation, thus robotizing ourselves and losing our humanity in a hasty quest for supra-human features ― a Faustian deal; read Goethe.
  • misusing the new possibilities to wage war at new scales, create subhuman conditions for many, or destroy the living planet.

Without proper support, we’re not OK. Alongside technological progress, we see more human suffering each year, psychologically and sociologically.

Can we give ourselves the needed support?

It doesn’t look like we are going in that direction. Disconcertingly, while a ‘third wave of attention’ requires the ability to discern broader patterns, gamification across many levels and domains leads precisely to an ever-shortening attention span.

Good leadership might be a counterweight, but the needed kind of Open Leadership doesn’t readily pop up. The emphasis on STEM in schooling, to the detriment of profound culture, cannot stem the tide. It’s not the way to build the future leaders we need ― leaders formed through wisdom more than techno-knowledge. Over time, this may lead to a vicious downward cycle.

Added to this comes the challenge of A.I. as an accelerator.

We should be afraid of non-Compassionate A.I. for several reasons. A crucial one is related to the above ― therefore, directly to us. It accelerates all the harm that other technology already lays at our doorstep.

Pulling the plug on further A.I. development is an illusion. It wouldn’t even be a solution, since the prime motor isn’t the A.I. itself ― at least, not yet.

Again, it’s us.

Therefore, we NEED Compassionate A.I.; the sooner, the better.
