
Sorry Mr. Yudkowsky, we’ll build it and everything will be fine

Review of “If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All” (2025), by Eliezer Yudkowsky and Nate Soares, with very critical commentary.

I’ve been reading the book “If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All” (2025), by Eliezer Yudkowsky and Nate Soares, published last week.

Yudkowsky and Soares present a stark warning about the dangers of developing artificial superintelligence (ASI), defined as artificial intelligence (AI) that vastly exceeds human intelligence. The authors argue that creating such AI using current techniques would almost certainly lead to human extinction, and that ASI therefore poses an existential threat to humanity. In their view, the race to build smarter-than-human AI is not an arms race but a “suicide race,” driven by competition and by an optimism that ignores fundamental risks.
