
The Daily AI Show: The Smoking Gun Conundrum
Mar 21, 2026

They debate how advanced AI can break traditional chains of blame and make harms hard to trace. Stories of self-driving crashes and moral crumple zones illustrate where people absorb responsibility. The conversation covers joint liability schemes, insurance pools, and the risk that opacity creates fictional blame. They also explore proposals like AI legal personhood and tensions between compensation and the need for justice.
Episode notes
When AI Destroys The Smoking Gun
- Advanced AI breaks the historical traceability of fault because systems adapt, self-tune, and change behavior in live environments.
- That dissolves the clean causal chain investigators used to find a single human or design error, leaving harms unattributable.
Hold Institutions Accountable To Preserve Deterrence
- Preserve institutional liability to maintain deterrence by holding builders and deployers financially responsible.
- Use joint liability and pooled insurance so victims get compensation and companies internalize safety incentives.
Uber Safety Driver Became The Moral Crumple Zone
- The 2018 Uber crash in Tempe revealed institutional failures, but prosecutors charged the low-paid safety driver rather than the corporation.
- Uber had disabled Volvo's emergency braking and deployed immature software, yet the human observer faced negligent homicide charges.