Law Report

How accurate is facial recognition software?

Mar 31, 2026
Associate Professor Catherine Kemp, a leader in law and technology policy, outlines the legal and privacy concerns raised by facial recognition. Alvi Chowdhury, a software engineer wrongly arrested after a mistaken facial recognition match, shares his personal account. They discuss wrongful arrests, racial bias and error rates, retailers' use of real-time systems, failures in notification and oversight, and calls for stronger limits and human review.
ANECDOTE

Wrong Arrest Caused By A Flawed Match

  • Alvi Chowdhury was wrongly arrested after a UK police facial recognition match linked his four‑year‑old custody photo to CCTV footage from a burglary in Milton Keynes.
  • He had alibi evidence placing him 150 km away, and only then discovered that his custody photo had remained on police systems despite an earlier no‑further‑action (NFA) decision.
INSIGHT

Facial Recognition Has Higher Error Rates For Minorities

  • UK Home Office research shows markedly higher error rates for ethnic minorities: about 4% for South Asian males and 9% for Black females.
  • Alvi says police continue to deploy systems with error rates that tech companies would call catastrophic.
ANECDOTE

Retailer Used Real‑Time Scanning Against Offender List

  • Bunnings used real‑time facial recognition across about 60 stores to screen customers against a list of repeat offenders, claiming that non‑matches were deleted almost immediately.
  • When the system flagged a match, staff reviewed the captured images to decide whether to remove the person from the store.