
The Current: When the law doesn't cover deepfake nudes
Mar 13, 2026
Suzie Dunn, a law professor specializing in technology and intimate-image harms, and Blair Rhodes, a CBC reporter who covered a Nova Scotia court case, discuss gaps in criminal law around AI-created nude images. They explain how aging legal definitions fail to capture deepfakes. The conversation covers harms to victims, platform regulation, civil remedies, and how future-proof laws could focus on harm and takedowns.
Classmates Targeted With AI Nude Images
- A Nova Scotia man scraped classmates' social media photos and used software to create realistic nude deepfakes.
- He then sent the manipulated images back to the five women, prompting charges including distributing intimate images without consent, criminal harassment, and distributing obscene material.
Court Says Law Lags Behind Deepfake Technology
- Judge Bronwyn Duffy ruled that AI-generated nudes did not meet the Criminal Code's definition of an "intimate image" because they were not visual recordings of real acts.
- The judge explicitly said the law lags behind the technology and that changes require Parliament, not judicial reinterpretation.
Legal Definitions Keep Missing New Harms
- Intimate-image law has long struggled with definitional gaps and Charter challenges, dating back to a 2013 Nova Scotia case.
- Technology keeps moving the goalposts, making old statutory language inadequate for new harms.
