
What is AI Alignment?

BlueDot Narrated


Outer alignment: reward misspecification

Perrin Walker describes reward misspecification and gives examples such as LLMs rewarded for convincing answers.
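The failure mode can be sketched in a few lines. This is a hypothetical toy example (not from the episode): the names `proxy_reward` and `true_objective` are illustrative, and the point is only that optimizing a misspecified proxy ("sounds convincing") can select an answer that misses the intended goal ("is correct").

```python
# Toy reward misspecification: the designer rewards convincingness (proxy),
# but actually wants correctness (true objective).

answers = [
    {"text": "confident but wrong", "convincingness": 0.9, "correct": False},
    {"text": "hedged but right",    "convincingness": 0.4, "correct": True},
]

def proxy_reward(a):
    # What the training signal actually measures.
    return a["convincingness"]

def true_objective(a):
    # What the designer intended.
    return 1.0 if a["correct"] else 0.0

# An optimizer of the proxy picks the convincing-but-wrong answer.
chosen = max(answers, key=proxy_reward)
print(chosen["text"])          # the proxy-maximizing answer
print(true_objective(chosen))  # scores 0 on the intended objective
```

The gap between `proxy_reward` and `true_objective` is exactly what "outer alignment" is about: even a perfect optimizer of the stated reward fails if the reward itself is misspecified.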
