#312
Mentioned in 85 episodes

If Anyone Builds It, Everyone Dies

Book • 2025
This book delves into the potential risks of advanced artificial intelligence, arguing that the development of superintelligence could lead to catastrophic consequences for humanity.

The authors present a compelling case for careful scrutiny and regulation of AI development.

They explore various scenarios and potential outcomes, emphasizing the urgency of addressing the challenges posed by rapidly advancing AI capabilities.

The book is written in an accessible style, making complex ideas understandable to a broad audience.

It serves as a call to action, urging policymakers and researchers to prioritize AI safety and prevent potential existential threats.


Mentioned by Cal Newport to introduce Eliezer Yudkowsky as a co-author and AI apocalypse warner.
1,396 snips
Ep. 377: The Case Against Superintelligence
Mentioned by Sam Parr, who says he has not read it and does not plan to.
701 snips
Story Of The Most Important Founder You've Never Heard Of
Mentioned by Sam Harris as the upcoming book by Eliezer Yudkowsky and Nate Soares on the dangers of superhuman AI.
381 snips
#434 — Can We Survive AI?
Mentioned by Andy Mills as a book co-authored by Nate Soares and Eliezer Yudkowsky, about the dangers of superintelligence.
375 snips
EP 6: The AI Doomers
Recommended as a fascinating lesson and an extreme case of fear around AI.
242 snips
500. Japan, China, and the Fight for Taiwan (Question Time)
Mentioned by Nathan Labenz as the work of Eliezer Yudkowsky, known as the prophet of AI doom.
183 snips
What AI Means for Students & Teachers: My Keynote from the Michigan Virtual AI Summit
Mentioned by Blaise Aguera y Arcas as a contrast to his own concerns, referring to Eliezer Yudkowsky's views on AI.
143 snips
Google Researcher Shows Life "Emerges From Code" - Blaise Agüera y Arcas
Mentioned by Tristan Harris as a provocative title by Eliezer Yudkowsky regarding the dangers of AI.
124 snips
Feed Drop: "Into the Machine" with Tobias Rose-Stockwell
Mentioned by Nate Hagens as a book recently published on the risks of artificial superintelligence.
88 snips
If Anyone Builds It, Everyone Dies: How Artificial Superintelligence Might Wipe Out Our Entire Species with Nate Soares
Mentioned by Liron Shapira when comparing his position to Eliezer Yudkowsky's.
83 snips
David Deutschian vs. Eliezer Yudkowskian Debate: Will AGI Cooperate With Humanity? — With Brett Hall
