
Stuff They Don't Want You To Know CLASSIC: The End Of The World with Josh Clark
Dec 23, 2023 In this episode, Josh Clark, cohost of Stuff You Should Know, discusses the potential end of the world and the science behind it, exploring topics such as the future of humanity, the dismissal of expertise, dangers in the biotech field, virus threats, and existential risks like the nuclear bomb.
Episode notes
How Long Humanity Could Last
- Humanity's theoretical lifespan ranges from ~1 billion years (if we stay on Earth) to potentially indefinite if we master stellar or universe engineering.
- Josh explains possibilities like moving Earth, altering the Sun's fuel usage, or growing lab universes as ways to vastly extend our future.
What Counts As An Existential Risk
- Existential risks are events that would permanently eliminate humanity or prevent recovery to our prior state.
- Josh cites Nick Bostrom and the Future of Humanity Institute as the originators of systematic existential-risk thinking.
Manmade Risks Outpace Oversight
- Existential risks include natural threats (asteroids, the Sun) and anthropogenic threats like AI, risky physics, biotech, and nanotech.
- Josh warns that anthropogenic risks are actionable yet receive too little attention, and that they are multiplying rapidly as technology progresses.
