#242 – Will MacAskill on how we survive the 'intelligence explosion,' AI character, and the case for 'viatopia'

80,000 Hours Podcast

Corrigibility may not be safest

Rob Wiblin presses the case for fully obedient AI, and Will MacAskill argues that goal vacuums, persona drift, and safer preference structures complicate that view.

Play episode from 17:50