
Apr 10, 2023

Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization | Lex Fridman Podcast #368

Posted in categories: alien life, robotics/AI

Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
- Linode: https://linode.com/lex to get $100 free credit.
- House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order.
- InsideTracker: https://insidetracker.com/lex to get 20% off.

EPISODE LINKS:
Eliezer’s Twitter: https://twitter.com/ESYudkowsky.
LessWrong Blog: https://lesswrong.com.
Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky.
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities.
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa.

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
Full episodes playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
Clips playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41

OUTLINE:
0:00 — Introduction
0:43 — GPT-4
23:23 — Open sourcing GPT-4
39:41 — Defining AGI
47:38 — AGI alignment
1:30:30 — How AGI may kill us
2:22:51 — Superintelligence
2:30:03 — Evolution
2:36:33 — Consciousness
2:47:04 — Aliens
2:52:35 — AGI Timeline
3:00:35 — Ego
3:06:27 — Advice for young people
3:11:45 — Mortality
3:13:26 — Love

SOCIAL:
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- Medium: https://medium.com/@lexfridman
- Reddit: https://reddit.com/r/lexfridman
- Support on Patreon: https://www.patreon.com/lexfridman
