Humans + AI

Cornelia C. Walther on AI for Inspired Action, return on values, prosocial AI, and the hybrid tipping zone (AC Ep35)

Mar 12, 2026
Cornelia Walther, a UN humanitarian leader turned academic and director of POZE, explores the hybrid tipping zone where humans and AI meet. She discusses agency decay, the need for double literacy (natural and algorithmic), the A-frame for mindful AI use, and what prosocial AI and return on values look like in practice.
INSIGHT

Agency Decay Threatens Human Autonomy

  • Agency decay occurs when individuals delegate thinking, feeling, and doing to AI, risking a loss of control and of the appetite for self-directed action.
  • Walther locates society on a curve from experimentation to integration and warns that, without intervention, we are nearing reliance and addiction.
INSIGHT

Values In Values Out Reframes AI Design

  • AI is neutral and becomes what we design it to be: the shift is from 'garbage in, garbage out' to 'values in, values out', which requires clarifying the desired values offline first.
  • Walther emphasizes starting with the values you want amplified before encoding them in systems.
ADVICE

Teach Double Literacy Alongside AI Skills

  • Build double literacy: teach natural intelligence (NI) and algorithmic intelligence (AI) side by side so people understand how the two complement and influence each other.
  • Walther says this awareness of both strengths and gaps prevents slipping from integration into reliance.