We Didn’t Know ― but We Could Have Known

October 1, 2023 – Sociocultural Issues

The first assertion – 'we didn't know' – is frequently used as an excuse. But is it enough? The issue becomes more and more consequential as technological possibilities expand.

When Albert Speer – 'Hitler's architect' – was asked after the war whether he had known, years before, about the atrocities the Jews were undergoing, he answered that 'whether he knew' was the wrong question. The right question was 'whether he could have known.'

We can know a lot if we don’t run away from the knowledge.

Take, for instance, the active denial during the COVID period. The consequences run up to an excess mortality of 50 million ― no kidding. 'We' definitely 'could' have known. The science would have been relatively easy.

But the people I still try to reach – with a view to the next pandemic – had and have more urgent things to do.

Then comes A.I.

It will probably make the environment more complex in the foreseeable future. After that, super-A.I. may ensure that complexity stays within humane proportions ― providing a human-level interface and explainability.

Fortunately, the complexity of A.I. will also enable us to make visible the complexity of ourselves. If we draw the correct lessons, this self-knowledge may be amazingly positive.

Getting past our basic cognitive illusion

In principle, we could have gotten there a few thousand years ago. However, it's challenging to want to see what is doing the seeing ― namely, your deeper self. Meanwhile, this deeper self does a myriad of things: regulating your breathing and heart rate, engendering every one of your thoughts and feelings, and so on.

Philosophy could have been a big help in this.

Too bad it hasn’t.

That makes this a case of 'you don't see until you see' ― apparently.

But I don’t dig that because there is also a gradual aspect involved. If you do your best, you can see at least something through the mist.

As with the Holocaust

Of course, already by the mid-to-late thirties, everybody in Europe – and many people outside the old continent – knew that something sinister was going on ― the closer to the disaster, the clearer.

As with healthcare

Of course, any physician – one of many millions – can discern that some deeper part of the human being is missing from the theory and practice, with atrocious consequences, ubiquitously and continually.

As with philosophy

Of course, any philosopher with some daring can do the little research needed to set them on a path essentially different from the mainstream.

Then why (not)?

Look at the daring mentioned in the last sentence. One can un-dare oneself into not wanting ― and from there, into not seeing. Most philosophers, for instance, know very well what this is about. At least, they could/should know what it takes.

Everybody can start with open eyes.

One strong invitation to do so is AURELIS.

Of course, it might get you where you do not want to be until you get there, after which you might not want to go back.

What is most frightening?
