
Parenting Teenagers Untangled (Understand and Talk to Your Teenager): Teenagers, AI, Nudes and Online Blackmail: What You Need to Know
Mar 4, 2026

Emma Hardy, Director at the Internet Watch Foundation, leads efforts to trace and remove child sexual abuse imagery. She discusses how teen nude sharing can become criminalized, how the Report Remove service works, common grooming and sextortion tactics, and the new risks posed by AI and nudifying tools. Practical advice covers family tech rules, TALK conversations, and where to get urgent help and image takedowns.
Episode notes
How IWF Stops Repeated Abuse Image Uploads
- The Internet Watch Foundation (IWF) uses digital fingerprints, called hashes, to block known child sexual abuse images across platforms.
- The IWF holds over 3 million hashes and supplies them to tech companies to prevent repeated uploads globally.
Use Report Remove Immediately
- Use the Report Remove service (run by the IWF and Childline) to submit images anonymously so analysts can assess them and create hashes.
- Childline offers counselling while the IWF assesses legality and blocks the content from being re-uploaded.
AI Can Manufacture Abuse Images From Photos
- Generative AI can manufacture child sexual abuse imagery without any real child being involved, working from existing images or text prompts.
- Offenders can feed a child's ordinary photo into nudifying tools to produce fake explicit images that cause real shame and harm.
