{"id":221012,"date":"2025-08-27T00:45:43","date_gmt":"2025-08-27T05:45:43","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2025\/08\/for-the-singularity-to-truly-arrive-wed-need-a-machine-that-eats-the-sun"},"modified":"2025-08-27T00:45:43","modified_gmt":"2025-08-27T05:45:43","slug":"for-the-singularity-to-truly-arrive-wed-need-a-machine-that-eats-the-sun","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2025\/08\/for-the-singularity-to-truly-arrive-wed-need-a-machine-that-eats-the-sun","title":{"rendered":"For the Singularity to Truly Arrive, We\u2019d Need a Machine That Eats the Sun"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/for-the-singularity-to-truly-arrive-wed-need-a-machine-that-eats-the-sun2.jpg\"><\/a><\/p>\n<p>However, if you\u2019re rich and you don\u2019t like the idea of a limit on computing, you can turn to futurism, longtermism, or \u201cAI optimism,\u201d depending on your favorite flavor. People in these camps believe in developing AI as fast as possible so we can (they claim) keep guardrails in place that will prevent AI from going rogue or becoming evil. (Today, people can\u2019t seem to\u2014or don\u2019t want to\u2014control whether or not their chatbots become racist, <a href=\"https:\/\/www.bbc.com\/news\/articles\/c3dpmlvx1k2o\" target=\"_blank\" class=\"\">are \u201csensual\u201d with children<\/a>, or <a href=\"https:\/\/www.psychologytoday.com\/us\/blog\/urban-survival\/202507\/the-emerging-problem-of-ai-psychosis\" target=\"_blank\" class=\"\">induce psychosis<\/a> in the general population, but sure.) <\/p>\n<p>The goal of these AI boosters is known as artificial general intelligence, or AGI. They theorize, or even hope for, an AI so powerful that it thinks like\u2026 well\u2026 a human mind whose ability is enhanced by a billion computers. 
If someone ever does develop an AGI that surpasses human intelligence, that moment is known as the AI singularity. (There are other, unrelated singularities in physics.) AI optimists want to accelerate the singularity and usher in this \u201cgodlike\u201d <a href=\"https:\/\/www.popularmechanics.com\/technology\/robots\/a64423163\/turing-test-gpt-45\/\" target=\"_blank\" class=\"\">AGI<\/a>.<\/p>\n<p>One of the key facts of computer logic is that, if you can slow a process down enough and look at it in enough detail, you can track and predict every single thing that a program will do. <a href=\"https:\/\/www.popularmechanics.com\/technology\/robots\/a26309827\/left-to-their-own-devices-pricing-algorithms-resort-to-collusion\/\" target=\"_blank\" class=\"\">Algorithms<\/a> (and not the opaque AI kind) guide everything within a computer. Over the decades, experts have written the exact ways information can be sent, one bit\u2014one minuscule electrical zap\u2014at a time through a central processing unit (CPU).<\/p>\n","protected":false},"excerpt":{"rendered":"<p>However, if you\u2019re rich and you don\u2019t like the idea of a limit on computing, you can turn to futurism, longtermism, or \u201cAI optimism,\u201d depending on your favorite flavor. 
People in these camps believe in developing AI as fast as possible so we can (they claim) keep guardrails in place that will prevent AI from [\u2026]<\/p>\n","protected":false},"author":718,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[41,219,6,64],"tags":[],"class_list":["post-221012","post","type-post","status-publish","format-standard","hentry","category-information-science","category-physics","category-robotics-ai","category-singularity"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/221012","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/718"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=221012"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/221012\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=221012"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=221012"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=221012"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}