Kill The Computer

Preview: Max Resist Ft. Charles Star

Apr 5, 2026
Charles Star, legal analyst and podcaster known for A-Lab, tackles wild legal stories. He previews cases where AI-sourced research got lawyers sanctioned. They unpack a Supreme Court fight over conversion therapy bans and explore how narrow rulings can reshape law. The conversation also tracks a global database of AI-related attorney mishaps and why hallucinations are dangerous in legal briefs.
INSIGHT

Narrow Plaintiffs Let Courts Convert Medical Acts Into Protected Speech

  • The Chiles v. Salazar discussion shows how framing a plaintiff as narrowly situated lets courts recast medical practice as protected speech.
  • That rhetorical tactic can incrementally expand First Amendment protections into areas states regulate as medical practice.
INSIGHT

Small Rulings Can Build A Staircase To Broader Rights Erosion

  • Even seemingly narrow Supreme Court rulings can serve as stepping stones: a narrow opinion today can enable a broader erosion of rights later as the Court's composition changes.
  • Charles warns that the conservative incremental approach builds a staircase of precedent.
ANECDOTE

ChatGPT Fabricated Case Law And Sank A Lawsuit

  • Charles Star recounted Mata v. Avianca, in which a lawyer used ChatGPT for legal research and the model fabricated eleven case citations.
  • The plaintiff had sued after two years, and the ChatGPT-fabricated cases falsely purported to toll the statute of limitations; when the fabrication was exposed, the court imposed sanctions.