# Shall we Put A.I. on Hold?

*Published 2023-03-29 (modified 2023-04-19) at https://aurelis.org/blog/artifical-intelligence/a-i-is-dangerous-pausing-a-i-even-more*

### Now and then, the admonition arises to put A.I. – or part of it – on hold to take a breath and think about possible dangers. There are pros and cons to this pause button.

**Doubtlessly, A.I. is challenging, as is any disruptive technology.**

A.I. can disrupt on steroids. It's not just about automation ― doing something with less effort. It's also about autonomization ― deciding based on a self-made strategy. [Note that this differs from autonomation, or 'intelligent automation' ― automation with a human touch.]

Due to this autonomization, A.I. can evolve toward performing larger sequential tasks with many different elements. In other words: it does not only let people do their jobs better. It can, to some degree, also do their jobs for them.

**No science fiction, soon science fact to a huge degree**

We can be confident of many considerable advancements in the near future, even if only based on today's core technologies, with the cognitive overhang trickling down soon. Moreover, there will surely be more core advancements.

Therefore, we need to think about many questions. The answers are urgently needed.

**So, pause the whole thing for a while?**

For instance, as the [Future of Life Institute](https://futureoflife.org/cause-area/artificial-intelligence/) has recently asked.
Note that they don't request a stop to the whole A.I. endeavor, but mainly to the giant black-box technologies (*) that are newsworthy nowadays. Who hasn't heard of ChatGPT?

I admire the institute. I'm an early [endorser of their Asilomar A.I. Principles for safe A.I.](https://futureoflife.org/open-letter/ai-principles/) Their present concern is understandable. Also, I'm not against this pausing. It is a correct attitude and speaks of profound responsibility.

**There are pros and cons.**

Since the pro (I see only one) is obvious, let's concentrate on a few cons that need to be taken into account, *especially* since it's such a crucial matter:

**If the call succeeds, not the whole world will pause in this respect.**

Rogue developers will see it as an opportunity to gain an advantage (or lessen a disadvantage) in the meantime. Think of dictatorial regimes, power-hungry individuals, or organizations. The winners will not be the open A.I. labs.

Also, even the researchers in the open labs will not stop thinking about new technological developments. More and more, the thinking itself is the crucial part. More and more, the near future of A.I. is technological and philosophical at the same time. Unfortunately, philosophy is not humankind's strongest asset. Arguably, moral philosophy is lagging behind even more ― say, some 2400 years, despite several turnarounds. Many engineers tend to be happily oblivious of Socrates' warning (the wise man knows how little he knows) and think the issue can be solved with some tweaks in the software.
It is preferable if they do so in public.

Moreover, will we now suddenly close the gap?

**I doubt that much thinking will be done in six months.**

Therefore, a short time-out to think might be somewhat misleading. Anyway, much more should have been done in the last six or sixty years ― since that is the timeframe over which the first huge promises of A.I. rose and fell.

Will there be some regulations in place after six months? Maybe. Will they still be relevant a year later? Probably not. Then we will have made many researchers' lives uneasy for little to nothing, possibly disappointing – or even losing – the brightest ones. It may lead to losing the expertise needed to tackle the most challenging problems.

**Still, a strong sense of urgency is needed.**

This can accompany a pause, but the uncertainty may also scare people away from thinking about it. Anxiety is never a good adviser. Frequently, it leads to bad decisions.

With so much at stake, we must be careful without provoking anxiety.

Therefore, my humble opinion is that pauses – now or in the future – should be handled cautiously.

―

(*) 'Black' since nobody knows what happens inside, and it also cannot be explained meaningfully in human-understandable terms.
Note that we ourselves are also a 'black box' for the most part, even if many people think otherwise or don't care.