{"id":207624,"date":"2025-03-02T15:08:04","date_gmt":"2025-03-02T21:08:04","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2025\/03\/why-a-global-federation-might-be-needed-to-manage-ai"},"modified":"2025-03-02T15:08:04","modified_gmt":"2025-03-02T21:08:04","slug":"why-a-global-federation-might-be-needed-to-manage-ai","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2025\/03\/why-a-global-federation-might-be-needed-to-manage-ai","title":{"rendered":"Why a global federation might be needed to manage AI"},"content":{"rendered":"<p><\/p>\n<p><iframe style=\"display: block; margin: 0 auto; width: 100%; aspect-ratio: 4\/3; object-fit: contain;\" src=\"https:\/\/www.youtube.com\/embed\/RG9Iuz4Zck0?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p>Rufo Guerreschi.<br \/>\n<a href=\"https:\/\/www.linkedin.com\/in\/rufoguerreschi\">https:\/\/www.linkedin.com\/in\/rufoguerreschi<\/a>.<\/p>\n<p>Coalition for a Baruch Plan for AI<br \/>\n<a href=\"https:\/\/www.cbpai.org\/\">https:\/\/www.cbpai.org\/<\/a><\/p>\n<p>0:00 Intro.<br \/> 0:21 Rufo Guerreschi.<br \/> 0:28 Contents.<br \/> 0:41 Part 1: Why we have a governance problem.<br \/> 1:18 From e-democracy to cybersecurity.<br \/> 2:42 Snowden showed that international standards were needed.<br \/> 3:55 Taking the needs of intelligence agencies into account.<br \/> 4:24 ChatGPT was a wake-up moment for privacy.<br \/> 5:08 Living in Geneva to interface with states.<br \/> 5:57 Decision-making is high up in government.<br \/> 6:26 Coalition for a Baruch Plan for AI.<br \/> 7:12 Parallels to organizations to manage nuclear safety.<br \/> 8:11 Hidden coordination between intelligence agencies.<br \/> 8:57 Intergovernmental treaties are not tight.<br \/> 10:19 The original Baruch Plan in 1946.<br \/> 11:28 Why the original Baruch Plan did not succeed.<br \/> 12:27 We almost had a different international 
structure.<br \/> 12:54 A global monopoly on violence.<br \/> 14:04 Could expand to other weapons.<br \/> 14:39 AI is a second opportunity for global governance.<br \/> 15:19 After Soviet tests, there was no secret to keep.<br \/> 16:22 Proliferation risk of AI tech is much greater?<br \/> 17:44 Scale and timeline of AI risk.<br \/> 19:04 Capabilities of security agencies.<br \/> 20:02 Internal capabilities of leading AI labs.<br \/> 20:58 Governments care about impactful technologies.<br \/> 22:06 Government compute, risk, other capabilities.<br \/> 23:05 Are domestic labs outside their jurisdiction?<br \/> 23:41 What are the timelines where change is required?<br \/> 24:54 Scientists, Musk, Amodei.<br \/> 26:24 Recursive self-improvement and loss of control.<br \/> 27:22 A grand gamble, the rosy perspective of CEOs.<br \/> 28:20 CEOs can\u2019t really say anything else.<br \/> 28:59 Altman, Trump, SoftBank pursuing superintelligence.<br \/> 30:01 Superintelligence is clearly defined by Nick Bostrom.<br \/> 30:52 Explain to people what \u201csuperintelligence\u201d means.<br \/> 31:32 Jobs created by the Stargate project?<br \/> 32:14 Will centralize power.<br \/> 33:33 Sharing of the benefits needs to be ensured.<br \/> 34:26 We are running out of time.<br \/> 35:27 Conditional treaty idea.<br \/> 36:34 Part 2: We can do this without a global dictatorship.<br \/> 36:44 Dictatorship concerns are very reasonable.<br \/> 37:19 Global power is already highly concentrated.<br \/> 38:13 We are already in a surveillance world.<br \/> 39:18 Affects influential people especially.<br \/> 40:13 Surveillance is largely unaccountable.<br \/> 41:35 Why did this machinery of surveillance evolve?<br \/> 42:34 Shadow activities.<br \/> 43:37 Choice of safety vs. liberty (privacy).<br \/> 44:26 How can this dichotomy be rephrased?<br \/> 45:23 Revisit supply chains and lawful access.<br \/> 46:37 Why the government broke all security at all levels.<br \/> 47:17 The encryption wars and 
export controls.<br \/> 48:16 Front door mechanism replaced by back door.<br \/> 49:21 The world we could live in.<br \/> 50:03 What would responding to requests look like?<br \/> 50:50 Apple may be leaving \u201cbug doors\u201d intentionally.<br \/> 52:23 Apple under same constraints as government.<br \/> 52:51 There are backdoors everywhere.<br \/> 53:45 China and the US need to both trust AI tech.<br \/> 55:10 Technical debt of past unsolved problems.<br \/> 55:53 Actually a governance debt (socio-technical).<br \/> 56:38 Provably safe or guaranteed safe AI.<br \/> 57:19 Requirement: Governance plus lawful access.<br \/> 58:46 Tor, Signal, etc. are often wishful thinking.<br \/> 59:26 Can restructure incentives.<br \/> 59:51 Restrict proliferation without dragnet?<br \/> 1:00:36 Physical plus focused surveillance.<br \/> 1:02:21 Dragnet surveillance since the telegraph.<br \/> 1:03:07 We have to build a digital dog.<br \/> 1:04:14 The dream of cyber libertarians.<br \/> 1:04:54 Is the government out to get you?<br \/> 1:05:55 Targeted surveillance is more important.<br \/> 1:06:57 A proper warrant process leveraging citizens.<br \/> 1:08:43 Just like procedures for elections.<br \/> 1:09:41 Use democratic system during chip fabrication.<br \/> 1:10:49 How democracy can help with technical challenges.<br \/> 1:11:31 Current world: anarchy between countries.<br \/> 1:12:25 Only those with the most guns and money rule.<br \/> 1:13:19 Everyone needing to spend a lot on military.<br \/> 1:14:04 AI also engages states in a race.<br \/> 1:15:16 Anarchy is not a given: US example.<br \/> 1:16:05 The forming of the United States.<br \/> 1:17:24 This federacy model could apply to AI.<br \/> 1:18:03 Same idea was even proposed by Sam Altman.<br \/> 1:18:54 How can we maximize the chances of success?<br \/> 1:19:46 Part 3: How to actually form international treaties.<br \/> 1:20:09 Calling for a world government scares people.<br \/> 1:21:17 Genuine risk of global 
dictatorship.<br \/> 1:21:45 We need a world \/federal\/ democratic government.<br \/> 1:23:02 Why people are not outspoken.<br \/> 1:24:12 Isn\u2019t it hard to get everyone on one page?<br \/> 1:25:20 Moving from anarchy to a social contract.<br \/> 1:26:11 Many states have very little sovereignty.<br \/> 1:26:53 Different religions didn\u2019t prevent common ground.<br \/> 1:28:16 China and US political systems similar.<br \/> 1:30:14 Coming together, values could be better.<br \/> 1:31:47 Critical mass of states.<br \/> 1:32:19 The Philadelphia convention example.<br \/> 1:32:44 Start with, say, seven states.<br \/> 1:33:48 Date of the US constitutional convention.<br \/> 1:34:42 US and China both invited but only together.<br \/> 1:35:43 Funding will make a big difference.<br \/> 1:38:36 Lobbying to the US and China.<br \/> 1:38:49 Conclusion.<br \/> 1:39:33 Outro.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Rufo Guerreschi. https:\/\/www.linkedin.com\/in\/rufoguerreschi. Coalition for a Baruch Plan for AI https:\/\/www.cbpai.org\/ 0:00 Intro. 0:21 Rufo Guerreschi. 0:28 Contents. 0:41 Part 1: Why we have a governance problem. 1:18 From e-democracy to cybersecurity. 2:42 Snowden showed that international standards were needed. 3:55 Taking the needs of intelligence agencies into account. 
4:24 ChatGPT was a wake up [\u2026]<\/p>\n","protected":false},"author":661,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34,1878,1625,5,1759,9,873,6,1511,22],"tags":[],"class_list":["post-207624","post","type-post","status-publish","format-standard","hentry","category-cybercrime-malcode","category-employment","category-encryption","category-geopolitics","category-governance","category-military","category-nuclear-energy","category-robotics-ai","category-surveillance","category-treaties"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/207624","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/661"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=207624"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/207624\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=207624"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=207624"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=207624"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}