{"id":167653,"date":"2023-07-16T14:22:41","date_gmt":"2023-07-16T19:22:41","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/07\/from-sci-fi-to-reality-addressing-ai-risks-with-david-brin"},"modified":"2023-07-16T14:22:41","modified_gmt":"2023-07-16T19:22:41","slug":"from-sci-fi-to-reality-addressing-ai-risks-with-david-brin","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/07\/from-sci-fi-to-reality-addressing-ai-risks-with-david-brin","title":{"rendered":"From Sci-Fi to Reality: Addressing AI Risks \u2014 with David Brin"},"content":{"rendered":"<p><\/p>\n<p><iframe style=\"display: block; margin: 0 auto; width: 100%; aspect-ratio: 4\/3; object-fit: contain;\" src=\"https:\/\/www.youtube.com\/embed\/EIIUYzfjFRM?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope;\n   picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p>AI had its nuclear bomb threshold. The biggest thing that happens to human technology maybe since the splitting of the atom.<\/p>\n<p>A conversation with Science Fiction author and a NASA consultant David Brin about the existential risks of AI and what approach we can take to address these risks.<\/p>\n<hr>\n<p>David Brin\u2019s advice for new authors.<br \/>\n<a href=\"http:\/\/www.davidbrin.com\/advice.htm\">http:\/\/www.davidbrin.com\/advice.htm<\/a>.<\/p>\n<p>David Brin\u2019s new WIRED article appraises the chances (nil) of an \u2018AI moratorium.\u2019 It then breaks down the three standard \u2018AI-formats\u2019 implicitly assumed by almost everyone in the field \u2013 corporate puppet, invasive blob, or \u2018Skynet\u2019. 
These are formats that can only lead to disaster.<br \/>\nHe proposes instead a fourth \u2014 that AI entities just might be held accountable if they have individuality\u2026 even \u2018soul\u2019\u2026 <a href=\"https:\/\/www.wired.com\/story\/give-ever\">https:\/\/www.wired.com\/story\/give-ever<\/a>\u2026<br \/>\nBrin\u2019s related NEWSWEEK op-ed (June\u201922) dealt with \u2018empathy bots\u2019 that feign sapience: <a href=\"https:\/\/www.newsweek.com\/soon-humanit\">https:\/\/www.newsweek.com\/soon-humanit<\/a>\u2026<br \/>\nHere also is a YouTube pod where I give an expanded version: \u2022 AI is Alive! Or i\u2026<br \/>\nHere\u2019s how all those fervid calls for an \u201cAI moratorium\u201d are doomed to fail: <a href=\"https:\/\/davidbrin.blogspot.com\/2023\/0\">https:\/\/davidbrin.blogspot.com\/2023\/0<\/a>\u2026<\/p>\n<p>David Brin\u2019s website: <a href=\"http:\/\/www.davidbrin.com\">http:\/\/www.davidbrin.com<\/a><br \/>\nDavid Brin\u2019s latest book, VIVID TOMORROWS: Science Fiction and Hollywood \u2014 <a href=\"http:\/\/www.davidbrin.com\/vividtomorro\">http:\/\/www.davidbrin.com\/vividtomorro<\/a>\u2026<br \/>\nDavid Brin\u2019s blog (Contrary Brin): <a href=\"http:\/\/davidbrin.blogspot.com\/\">http:\/\/davidbrin.blogspot.com\/<\/a><\/p>\n<p>Links:<br \/>\nQuora blog: <a href=\"https:\/\/spacefaringcivilization.quora\">https:\/\/spacefaringcivilization.quora<\/a>\u2026<br \/>\nAmazon Author page: <a href=\"http:\/\/amazon.com\/author\/ronfriedman\">http:\/\/amazon.com\/author\/ronfriedman<\/a><br \/>\nMy Website: <a href=\"https:\/\/ronsfriedman.wordpress.com\/\">https:\/\/ronsfriedman.wordpress.com\/<\/a><br \/>\nSubscribe to my mailing list: <a href=\"https:\/\/ronsfriedman.wordpress.com\/su\">https:\/\/ronsfriedman.wordpress.com\/su<\/a>\u2026<\/p>\n<p>How to support the channel:<br \/>\nGet $5 in NDAX (Canadian Crypto Exchange): <a href=\"https:\/\/refer.ndax.io\/vm1j\">https:\/\/refer.ndax.io\/vm1j<\/a><br \/>\nBuy the Escape Velocity short story collection.<br \/>\nSupport with an Ethereum or Polygon donation: sciandscifi.nft.<br \/>\n<a
href=\"https:\/\/www.wired.com\/story\/give-every-ai-a-soul-or-else\/\">https:\/\/www.wired.com\/story\/give-every-ai-a-soul-or-else\/<\/a><\/p>\n<p>David Brin\u2019s related NEWSWEEK op-ed (June\u201922) dealt with \u2018empathy bots\u201d that feign sapience.<\/p>\n<div class=\"more-link-wrapper\"> <a class=\"more-link\" href=\"https:\/\/lifeboat.com\/blog\/2023\/07\/from-sci-fi-to-reality-addressing-ai-risks-with-david-brin\">Continue reading \u201cFrom Sci-Fi to Reality: Addressing AI Risks \u2014 with David Brin\u201d | &gt;<\/a><\/div><\/p>\n","protected":false},"excerpt":{"rendered":"<p>AI had its nuclear bomb threshold. The biggest thing that happens to human technology maybe since the splitting of the atom. A conversation with Science Fiction author and a NASA consultant David Brin about the existential risks of AI and what approach we can take to address these risks. David Brin\u2019s advice for new authors. [\u2026]<\/p>\n","protected":false},"author":655,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1761,12,9,48,6],"tags":[],"class_list":["post-167653","post","type-post","status-publish","format-standard","hentry","category-cryptocurrencies","category-existential-risks","category-military","category-particle-physics","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/167653","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/655"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=167653"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/167653\/revisions"}],"wp:attachment":[{
"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=167653"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=167653"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=167653"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}