In this episode, we return to the subject of existential risks, but with a focus on what actions can be taken to eliminate or reduce these risks.

Our guest is James Norris, who describes himself on his website as an existential safety advocate. The website lists four primary organizations which he leads: the International AI Governance Alliance, Upgradable, the Center for Existential Safety, and Survival Sanctuaries.

Previously, one of James’ many successful initiatives was Effective Altruism Global, the international conference series for effective altruists. He also spent some time as the organizer of a kind of sibling organization to London Futurists, namely Bay Area Futurists. He graduated from the University of Texas at Austin with a triple major in psychology, sociology, and philosophy, as well as with minors in too many subjects to mention.

Selected follow-ups:

• James Norris website (https://www.jamesnorris.org/)
• Upgrade your life & legacy (https://www.upgradable.org/) — Upgradable.
• The 7 Habits of Highly Effective People (https://www.franklincovey.com/courses…) — Stephen Covey.
• Beneficial AI 2017 (https://futureoflife.org/event/bai-2017/) — Asilomar conference.
