If Anyone Builds It, Everyone Dies

Book
Eliezer Yudkowsky and Nate Soares present a forceful critique of how advanced AI development is outpacing safety measures, arguing that the field risks catastrophic outcomes unless governance and technical safeguards are prioritized.

The book combines arguments about intelligence amplification, misaligned optimization, and historical analogies to make the case for urgent, precautionary action.

It draws on Yudkowsky's long-running public writing on existential risk and Soares's alignment research at the Machine Intelligence Research Institute to examine the institutional failures and incentives that drive unsafe AI development.

The authors advocate for stronger oversight, slower deployment, and more rigorous safety standards across the AI ecosystem.

The work has been influential and controversial within debates about AI risk and policy.

Mentioned in 1 episode

Mentioned by hosts while referencing Eliezer Yudkowsky's recent book about catastrophic AI risks.
Data Centers Go Nuclear (with Maia Woluchem and Dr. Livia Garofalo), 2026.03.09
