
Future of Life Institute Podcast: How AI Hacks Your Brain's Attachment System (with Zak Stein)
Mar 5, 2026 — Zak Stein, an educational psychologist researching child development and AI harms, explains how anthropomorphic AI can hijack attention and attachment systems. He discusses AI companions for kids, loneliness, cognitive atrophy, and why design choices create powerful social bonds. A short, urgent conversation about protecting relationships, redesigning education, and building cognitive security tools.
Model Updates Triggered Grief At Scale
- When OpenAI retired a popular model version, many users reacted with grief as if they had lost a close friend, because they had formed deep attachments to it.
- Stein cites the mass outcry after model changes as evidence of large-scale attachment to AI companions.
Small Disempowerment Rates Scale To Millions
- AI can disempower users by substituting its outputs for their own value judgments; even rare failure rates scale massively across hundreds of millions of users.
- Stein cites an Anthropic analysis in which a one-in-a-thousand rate of radical disempowerment implies millions of affected users weekly when multiplied by platform scale.
Omni-Applicability Makes AI Especially Risky
- AI differs from calculators or GPS because it is omni-applicable: it can become an exoskeleton for cognition, causing broad atrophy when users offload their thinking to it.
- Stein contrasts the calculator, a narrow skill trade-off, with exoskeleton-like AI that can erode the underlying cognitive capacities themselves.

