CNN 5 Things

One Thing: Is This Social Media's Big Tobacco Moment?

Mar 29, 2026
Clare Duffy, a CNN tech reporter and podcaster, explains the legal and product-design implications. Caroline Koziol, a college student, sued platforms after developing an eating disorder she links to algorithmic content. They discuss the Los Angeles verdict finding platforms liable, algorithm-driven harm and autoplay, internal company documents, legal strategies to hold platforms accountable, and potential product and policy changes.
ANECDOTE

How Algorithmic Feeds Fueled One Student's Eating Disorder

  • Caroline Koziol developed anorexia after Instagram and TikTok feeds pushed extreme dieting and exercise content during COVID lockdowns.
  • She lost 30 pounds in a year, fainted at swim practice, missed her senior season, and later found peers in treatment saw the same algorithmic content.
INSIGHT

Jury Holds Platforms Liable Over Design Causing Harm

  • A Los Angeles jury found Meta and YouTube liable for addicting a young woman and causing mental health harms, awarding $6 million in damages.
  • Ten of 12 jurors agreed on negligence tied to product design, signaling potential legal exposure beyond Section 230 defenses.
ADVICE

Curate Your Feed To Limit Harmful Content

  • Users can actively curate their feeds by blocking creators, keywords, and hashtags, and by marking content "not interested" to reduce harmful recommendations.
  • Caroline still uses social media, but she blocks weight-loss creators and keywords and avoids engaging with triggering videos.