{"id":9025,"date":"2022-01-01T08:11:28","date_gmt":"2022-01-01T08:11:28","guid":{"rendered":"https:\/\/aurelis.org\/blog\/?p=9025"},"modified":"2022-06-07T12:48:28","modified_gmt":"2022-06-07T12:48:28","slug":"will-a-i-soon-be-smarter-than-us","status":"publish","type":"post","link":"https:\/\/aurelis.org\/blog\/artifical-intelligence\/will-a-i-soon-be-smarter-than-us","title":{"rendered":"Will A.I. Soon be Smarter than Us?"},"content":{"rendered":"\n<h3>This text may be interesting to many because these ideas may shape their future to the highest degree. It\u2019s smart to see why something else will be even smarter.<\/h3>\n\n\n\n<p><strong>Soon?<\/strong><\/p>\n\n\n\n<p>Soon enough. The ongoing evolution toward the state in the title will not be obvious while it happens. In retrospect, it will look amazingly rapid. By then, we may wonder why so many didn\u2019t see it coming.<\/p>\n\n\n\n<p>But, agreed, we\u2019re not there yet.<\/p>\n\n\n\n<p><strong>Why smarter? The short answer.<\/strong><\/p>\n\n\n\n<p>The short answer is Moore\u2019s Law: a continual, exponentially falling cost per unit of computation.<\/p>\n\n\n\n<p>That\u2019s it. This may feel a bit disappointing. Will we be outsmarted because something else becomes cheaper? Is this not disrespectful? Is it not\u2026 inhumane?<\/p>\n\n\n\n<p>Yes, well, not necessarily, but possibly \u2015 one more reason to delve into this before it\u2019s too late.<\/p>\n\n\n\n<p><strong>The challenge of past A.I.<\/strong><\/p>\n\n\n\n<p>The A.I. of a few decades ago was oriented toward leveraging human knowledge. There were several reasons for this, one of them being the neglect of Moore\u2019s Law. Implicitly, A.I. was envisioned within a computational constraint resembling the one we thought applied to ourselves. Massively more computation at the disposal of A.I. was not conceived of.<\/p>\n\n\n\n<p>Also, there was no sufficient conception of A.I. computations strangely and challengingly different from those in vogue at the time. Deep Neural Networks did exist, but they weren\u2019t taken seriously (enough) \u2014 the same with reinforcement learning.<\/p>\n\n\n\n<p>Now, it\u2019s becoming increasingly evident that these two technology areas may together shape the future.<\/p>\n\n\n\n<p><strong>A.I. is in the size.<\/strong><\/p>\n\n\n\n<p>Not so much the size of more of what already seems to work somewhat, but the size that enables new frameworks and original exploitations of existing frameworks. Fringe ideas may thus be leveraged by unexpected opportunities, overpowering what may seem straightforward at first. The fringe ideas are sometimes much simpler, such as statistics instead of theories about the problem-solving human mind. How we think we think is not always correct, nor the most efficient, nor the most scalable with increasing computational capacity.<\/p>\n\n\n\n<p>We may still have more power under the skull than any computer, yet Moore\u2019s Law does not apply to humankind \u2015 except if one becomes a cyborg (half-human, half-machine). In short, we are stuck with ourselves.<\/p>\n\n\n\n<p><strong>A.I. will increasingly emulate us, and it will increasingly not.<\/strong><\/p>\n\n\n\n<p>Human learning and knowledge representation methods are interesting but not primarily suited to leveraging silicon computation. A.I. will take advantage of the Law in every way it can. We have already seen this in A.I.\u2019s overtaking of chess masters, then Go masters, and more. A.I. is already becoming \u2018smarter\u2019 than us in many distinct domains.<\/p>\n\n\n\n<p>Still, we may try to take refuge in being smarter in many fields simultaneously. The question is, for how long?<\/p>\n\n\n\n<p><strong>The future lies in self-scalability.<\/strong><\/p>\n\n\n\n<p>A self-scalable system is self-learning, taking advantage of increasing hardware possibilities.
This probably means that the backbone will resemble something like reinforcement learning, in a combination of planning (model-based) and searching (model-free) \u2015 also, in a combination of conceptual and subconceptual processing. This is just my idea.<\/p>\n\n\n\n<p>This is also crucial in view of the seemingly endless complexity of the world outside as well as inside of us, let alone of some intelligence that surpasses us. Such complexity cannot be conceptually emulated. What is needed is a system that self-scales in a process of search and discovery. Self-scaling, the system doesn\u2019t only exhibit what the engineer has put into it. Only thus may it be called \u2018intelligent\u2019 in the first place.<\/p>\n\n\n\n<p>In any case, the meta-method of self-scalability is a must for any A.I. to reach super-A.I. and go beyond. We humans don\u2019t even know what lies beyond. Will we be able to understand? Probably not, because there is no end to the progress. The bow doesn\u2019t follow the arrow, which by itself becomes another supercharged bow that again doesn\u2019t follow the arrow.<\/p>\n\n\n\n<p>In competition, we have no chance.<\/p>\n\n\n\n<p>Let\u2019s hope for something different.<\/p>","protected":false},"excerpt":{"rendered":"<p>This text may be interesting to many because these ideas may shape their future to the highest degree. It\u2019s smart to see why something else will be even smarter. Soon? Soon enough. The ongoing evolution toward the state in the title will not be obvious while it happens. In retrospect, it will look amazingly rapid.
<a class=\"moretag\" href=\"https:\/\/aurelis.org\/blog\/artifical-intelligence\/will-a-i-soon-be-smarter-than-us\">Read the full article&#8230;<\/a><\/p>","protected":false},"author":2,"featured_media":9030,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"spay_email":"","jetpack_publicize_message":""},"categories":[28],"tags":[],"jetpack_featured_media_url":"https:\/\/i2.wp.com\/aurelis.org\/blog\/wp-content\/uploads\/2022\/06\/1667-2.jpg?fit=961%2C559&ssl=1","jetpack_publicize_connections":[],"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p9Fdiq-2lz","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts\/9025"}],"collection":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/comments?post=9025"}],"version-history":[{"count":7,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts\/9025\/revisions"}],"predecessor-version":[{"id":9036,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts\/9025\/revisions\/9036"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/media\/9030"}],"wp:attachment":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/media?parent=9025"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/categories?post=9025"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/tags?post=9025"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}