{"id":14235,"date":"2024-01-19T12:28:39","date_gmt":"2024-01-19T12:28:39","guid":{"rendered":"https:\/\/aurelis.org\/blog\/?p=14235"},"modified":"2024-01-20T16:39:28","modified_gmt":"2024-01-20T16:39:28","slug":"a-i-is-in-the-size","status":"publish","type":"post","link":"https:\/\/aurelis.org\/blog\/artifical-intelligence\/a-i-is-in-the-size","title":{"rendered":"&#8220;A.I. is in the Size.&#8221;"},"content":{"rendered":"\n<h3>This famous quote by R.C. Schank (1991) gets new relevance with GPT technology \u2015 in a surprisingly different way.<\/h3>\n\n\n\n<p><strong>How Schank interpreted his quote<\/strong><\/p>\n\n\n\n<p>He meant that one cannot conclude \u2018intelligence\u2019 from a simple demo \u2015 as was usual in the days of purely conceptual GOFAI (Good Old-Fashioned A.I.). Back then, many Ph.D. students showed \u2018intelligence\u2019 within a system by, for instance, letting it translate a few sentences.<\/p>\n\n\n\n<p>Schank taught that one has to scale the system up to see whether it still acts intelligently. Otherwise, he said, it\u2019s just a simulation of intelligence.<\/p>\n\n\n\n<p><strong>Anno 2024, we see scaled systems acting intelligently \u2015 or do we?<\/strong><\/p>\n\n\n\n<p>What happened with GPT: within a relatively simple paradigm, the sheer increase in parameters and training data produced an emergence of competencies that was unexpected, even to the developers. 
Such a system is called a foundation model because it can be used as a foundation for many concrete applications \u2015 being generally applicable.<\/p>\n\n\n\n<p>Here, indeed, are elements of whatever one may call \u2018intelligence.\u2019<\/p>\n\n\n\n<p>Something special happens due to size.<\/p>\n\n\n\n<p><strong>This is also the case for natural (our human) intelligence.<\/strong><\/p>\n\n\n\n<p>Here, too, it\u2019s in the size, as becomes apparent by delving into <a href=\"https:\/\/aurelis.org\/blog?p=14228\">evolutionary matters concerning the brain<\/a>.<\/p>\n\n\n\n<p>Are we and GPT, therefore, just two examples of the same principle?<\/p>\n\n\n\n<p><strong>The \u2018Chinese room\u2019 thought experiment [J. Searle, 1980]<\/strong><\/p>\n\n\n\n<p>Here, also, a simple mechanism is involved, and a complex result is attained.<\/p>\n\n\n\n<p>Imagine a conversation in Chinese without anyone or anything comprehending Chinese. The size here lies in a gigantic lookup table of Chinese&lt;-&gt;English phrases used by a person who doesn\u2019t know a single Chinese character. Searle argued that, as in this case, something can seem intelligent by acting intelligently without being intelligent. A lookup table, he said, is not intelligent. It\u2019s just a simulation of intelligence.<\/p>\n\n\n\n<p>Can the argument be reversed now? Something is intelligent when it acts intelligently \u2014 even when the internals are \u2018just\u2019 a gigantic number of elements (as in an immense lookup table) and a pretty simple but to-the-point way of handling that amount.<\/p>\n\n\n\n<p><strong>\u201cIs something intelligent?\u201d is only half a question.<\/strong><\/p>\n\n\n\n<p>Thus, there is no single answer.<\/p>\n\n\n\n<p>It is better to differentiate between implicit vs. explicit intelligence \u2014 or competence vs. comprehension, system-1 vs. 
system-2, or some other distinction in this direction.<\/p>\n\n\n\n<p>Let\u2019s stick to the first, going a bit deeper before answering any question.<\/p>\n\n\n\n<p><strong><a href=\"https:\/\/aurelis.org\/blog?p=12740\">Implicit \u2014 explicit<\/a><\/strong><\/p>\n\n\n\n<p>\u2018Explicit\u2019 can also reside in how the implicit presents itself at the interfaces between many modules. These modules can be concepts, for instance, or what we call thoughts and feelings. The internals of any module may be implicit. It\u2019s enough that a module acts explicitly at its interface, since that is all the other modules see.<\/p>\n\n\n\n<p>Of course, the module is also expected to act consistently \u2013 more or less \u2013 each time it is called upon.<\/p>\n\n\n\n<p><strong>Welcome to how we think.<\/strong><\/p>\n\n\n\n<p>With our neurons and synapses continually in motion \u2013 being alive \u2013 we \u201cnever think the same thought twice,\u201d even though we are generally hardly aware of this. Our modules (thoughts) are explicit only more or less and only at their interface.<\/p>\n\n\n\n<p>Also, we don\u2019t need to be explicitly perfect. Good enough is, well, enough to survive and thrive in a natural environment. When we want to be perfect, we have to invent mathematics \u2015 which we did.<\/p>\n\n\n\n<p><strong>For implicit intelligence, size + a simple mechanism is enough.<\/strong><\/p>\n\n\n\n<p>This is what we see in present-day GPT. Thus, the answer to \u201cIs it intelligent?\u201d is: implicitly, yes. It is competent to a surprising degree.<\/p>\n\n\n\n<p>But explicitly? At this time, to a much smaller degree. Although it can handle explicit knowledge, it does so in a very implicit way. It lacks comprehension. It can gain that in many ways, and apparently, that\u2019s what is going on now.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote\"><p>Which, of course, is dangerous! 
Without Compassionate input, humanity may become just one more temporary hominid all too quickly.<\/p><\/blockquote>\n\n\n\n<p><strong>Two lessons from \u201cA.I. is in the size.\u201d<\/strong><\/p>\n\n\n\n<ul><li>This quote is, in its new dress, very applicable to implicit intelligence. In this sense, we can be sure that GPT is only one example that has been stumbled upon. We may see implicit intelligence also readily emerge with other kinds of systems. It\u2019s in the size.<\/li><li>The other lesson is that the same principle may also apply to explicit intelligence. To attain this, we may need another kind of element \u2015 probably a more formalized kind. But apart from that, here too, it\u2019s in the size.<\/li><\/ul>\n\n\n\n<p><strong>Size matters.<\/strong><\/p>\n\n\n\n<p>Of course, what also matters is the kind of elements involved and the way of handling them.<\/p>\n\n\n\n<p>Note that this is \u2018only\u2019 about intelligence. How it will be used \u2013 and how it will use itself \u2013 appears to be another matter for now.<\/p>\n\n\n\n<p>Will super-A.I. 
be \u2018smarter\u2019 than us also in its wisdom?<\/p>","protected":false},"excerpt":{"rendered":"<p>This famous 
quote by R.C. Schank (1991) gets new relevance with GPT technology \u2015 in a surprisingly different way. How Schank interpreted his quote He meant that one cannot conclude \u2018intelligence\u2019 from a simple demo \u2015 as was usual in the days of purely conceptual GOFAI (Good Old-Fashioned A.I.). Back then, many Ph.D. students <a class=\"moretag\" href=\"https:\/\/aurelis.org\/blog\/artifical-intelligence\/a-i-is-in-the-size\">Read the full article&#8230;<\/a><\/p>","protected":false},"author":2,"featured_media":14266,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"spay_email":"","jetpack_publicize_message":""},"categories":[28],"tags":[],"jetpack_featured_media_url":"https:\/\/i2.wp.com\/aurelis.org\/blog\/wp-content\/uploads\/2024\/01\/2316-1.jpg?fit=960%2C560&ssl=1","jetpack_publicize_connections":[],"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p9Fdiq-3HB","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts\/14235"}],"collection":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/comments?post=14235"}],"version-history":[{"count":8,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts\/14235\/revisions"}],"predecessor-version":[{"id":14270,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/posts\/14235\/revisions\/14270"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/media\/14266"}],"wp:attachment":[{"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/media?parent=14235"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/categories?post=14235"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/aurelis.org\/blog\/wp-json\/wp\/v2\/tags?post=14235"}],
"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}