
ILTA Voices #0165: (CT) ILTA Just-in-Time: Why Firms Need to Upskill Internal "AI Editors"
Mar 5, 2026
Kathy Harford, a Senior Knowledge Lawyer focused on legal AI and practice transformation, explains the rise of AI editors and why new review skills are essential. Topics include AI-specific errors such as hallucinations, how review workflows must adapt for auditability, and when AI helps versus when it creates extra work.
AI Snips
AI Creates New Error Types That Require Human Judgment
- AI will make mistakes in new ways like hallucinations, so humans remain essential in many use cases.
- Kathy Harford emphasizes we often lack objective benchmarks for 'good enough,' making human judgment crucial across contexts.
Rethink Which Legal Skills Are Foundational
- Deciding which legal skills remain foundational will be debated, similar to how calculators changed math teaching.
- Kathy notes we shouldn't assume which tasks are essential for lawyer development versus ones AI will absorb.
Public Database Tracks Hallucinated Cases
- Patrick mentions a public database that tracks AI-hallucinated cases to illustrate real-world harm.
- He cites Damian's database, which had tracked 1,002 hallucinated cases filed as of that morning.

