Future A.I.: Fluid or Solid?

March 24, 2021 – Artificial Intelligence

Humans are fluid thinkers. That gives us huge strength and some major challenges. The one does not go without the other. A.I. – including Semantic A.I. – is still a very different matter.

Through proper context, data becomes information.

Still, the information stored in a book is not in any way like the information that we ‘store’ in our human memory.

Likewise, smart data in a present-day A.I. system is not in any way like data stored in the human brain/mind. For starters, human ‘software’ = hardware. Mind IS brain ― at least, to a substantial degree.

Likewise, human intelligence is not in any way like so-called intelligence in present-day A.I. Mainly, human intelligence is much more fluid.

Even human solidity in thinking is embedded in fluidity.

In humans, every concept, every piece of solidity, every thought and feeling is the result of fluidity. Our stream of consciousness is a stream of non-conscious fluidity of which only parts break through the surface of conscious awareness. This way, we think parallel and distributed. [see: “Patterns in Neurophysiology”]

This makes us robust and highly efficient. It comes at a price: proneness to logical errors, forgetfulness, biases of all kinds, dissociation leading to health problems in body and mind. We should probably not want to create a duplicate of ourselves, even if that would be possible.

This is also about the age-old question of culture versus nature. [see: “Culture and Nature”]

Why would A.I. need to be fluid?

Fluidity is needed to see reality as only a fluid system can see it. What we call ‘common sense’ certainly pertains to this domain.

In dealing with humans, only a fluid system can understand us as we naturally are. This needs to be reckoned with as A.I. systems become more complex. We should strive not to robotize humans through A.I. applications. [see: “Robotizing Humans or Humanizing Robots”]

Semantic A.I.

= the use of knowledge graphs + present-day A.I. technologies + natural language processing.

A ‘semantic net’ is a knowledge graph, basically a huge bunch of concepts and conceptual links.
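For a concrete (though highly simplified) picture, the sketch below represents such a net as concepts with labeled links between them. It is only an illustration: the SemanticNet class and all concept names are invented here, not taken from any particular knowledge-graph tool.

```python
# A minimal sketch of a semantic net: concepts joined by labeled links.
from collections import defaultdict

class SemanticNet:
    def __init__(self):
        # concept -> list of (relation, other concept) links
        self.links = defaultdict(list)

    def add(self, subject, relation, obj):
        """Store one conceptual link as a (subject, relation, object) triple."""
        self.links[subject].append((relation, obj))

    def related(self, subject):
        """Return every concept directly linked to the given one."""
        return self.links[subject]

net = SemanticNet()
net.add("bird", "is_a", "animal")
net.add("bird", "can", "fly")
net.add("penguin", "is_a", "bird")

print(net.related("bird"))  # [('is_a', 'animal'), ('can', 'fly')]
```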

The aim of semantic A.I. is an integration of top-down processing (~knowledge graphs – deductive) and bottom-up processing (~deep neural nets – inductive). The former is solid; the latter is fluid. In this striving, it resembles the human case, but not in the way it performs the combination. That last word is crucial: a combination is not a synthesis, not an intrinsically integrated whole. Whatever integration is achieved does not come from the inside out.
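What ‘combination, not synthesis’ can look like in practice: in the rough sketch below, a symbolic (top-down) module and a stand-in for a neural (bottom-up) module each form their own judgement, and the system merely averages the two numbers at the end. All facts, names, and confidence values are hypothetical, for illustration only.

```python
# A rough sketch of combination without synthesis: two separate modules,
# merged only at the output. Everything here is hypothetical.
KNOWN_FACTS = {("bird", "can_fly"), ("plane", "can_fly")}

def symbolic_judgement(concept, claim):
    """Top-down, deductive: look the claim up in explicit knowledge."""
    return 1.0 if (concept, claim) in KNOWN_FACTS else 0.0

def neural_judgement(concept, claim):
    """Bottom-up, inductive: stand-in for a trained deep net's confidence."""
    return {"bird": 0.9, "penguin": 0.4}.get(concept, 0.5)

def combined_judgement(concept, claim, weight=0.5):
    """A weighted average; the two modules never exchange anything internally."""
    return weight * symbolic_judgement(concept, claim) + (1 - weight) * neural_judgement(concept, claim)

print(combined_judgement("penguin", "can_fly"))  # 0.2
```

Nothing flows between the two parts internally; the merge happens only at the surface. That is precisely the opposite of an integration from inside out.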

Semantic A.I. does not think parallel and distributed in a human way.

Compassionate A.I. takes proper care of fluidity.

As a general principle, solid A.I. – as any solidity – is only meaningful if it serves as a container for fluidity. This follows from the very definition of meaningfulness. [see: “The Meaning of Meaning”] The information in a book is not meaningful by itself. It only becomes so through a (fluid) reader.

To meaningfully heal – from inside out, mentally and psycho-somatically – as well as to attain and enhance our typically human potential, fluidity is necessary. A coaching chatbot should be Compassionate.

In my view, of course, all A.I. should be Compassionate to a fair degree. [see: “The Journey Towards Compassionate A.I.”] This entails taking care of fluidity where appropriate and thinking profoundly about the Compassionate side, even when it doesn’t appear relevant at first sight.

Only this way does the future look Compassionate.
