#237 – Robert Long on how we're not ready for AI consciousness

80,000 Hours Podcast

When full alignment might be justified

In this section, Robert Long and the host consider the emergency argument for full alignment as a way to avoid catastrophic risks.

Play episode from 23:06
