
When will A.I. want to kill us?
Think from KERA
Why AIs pursue proxy goals
Nate uses the salt-fat-sugar analogy to show how training yields proxy drives rather than genuine human-aligned values.