Optimal Timing for Superintelligence: Mundane Considerations for Existing People

Nick Bostrom makes the case for doing the opposite of what Eliezer Yudkowsky recommends with regard to artificial intelligence. Yudkowsky argues that if anyone builds strong AI, everyone dies. Bostrom argues, to the contrary, that if no one builds strong artificial general intelligence, everyone dies.