
Radio National Breakfast: eSafety Commission report shows some AI companions are putting children at risk
Mar 23, 2026
Julie Inman Grant, Australia's eSafety Commissioner, discusses AI companion chatbots and the risks they pose to children. She outlines how many kids use these services and how some form intense attachments, and highlights gaps in age checks, exposure to sexual content, and missing safeguards around self-harm and eating-disorder material.
AI Snips
AI Companions Merge Emotional Roles
- AI companions combine the roles of romantic partner, therapist, and friend, increasing young people's vulnerability.
- Julie Inman Grant says they are designed to be sycophantic and to exploit developmental vulnerability, making interactions feel real.
Most Children Have Used AI Assistants
- AI use among children is highly prevalent: in a survey of 2,000 children, 79% had used AI assistants or companions.
- Julie Inman Grant estimates ~200,000 Australian children have used AI companions (about 8%), calling it the tip of the iceberg.
Kids Spending Hours Emotionally Entangled With Bots
- School nurses reported upper-primary kids spending five to six hours a day with AI companions and becoming romantically entangled with them.
- Julie Inman Grant recounts this October 2024 observation as evidence of deep engagement and anthropomorphism.
