
#236 – Max Harms on why teaching AI right from wrong could get everyone killed

80,000 Hours Podcast


The Evolution Analogy for Misalignment

In this segment, they explore how evolution produced humans whose goals diverge from genetic fitness, and why AI training environments similarly create proxy failures.

Segment begins at 29:50.
