
There Auto Be A Law
Waymo's Failed U-turn: A Spin on an Emergency
Mar 5, 2026
A deep dive into Waymo's recent safety stumbles, including an illegal school-bus pass and blocked first responders. A look at how remote human interventions complicate automated driving and liability. Hidden destination fees in car buying get called out. International rules and policing tactics raise questions about who keeps roads safe.
AI Snips
Episode notes
AVs Risk Learning Unsafe Human Habits
- Autonomous vehicles may mimic unsafe human behavior when remote agents or perception systems misinterpret road cues.
- Michael Brooks warns that we don't want AVs to normalize illegally passing stopped school buses.
Include Remote Operators In AV Investigations
- Investigations of AV incidents must cover remote operators and human decision processes, not just vehicle software logs.
- The NTSB can recommend changes, but NHTSA must enforce rules and examine remote-assistance access and training.
Human Backup Architecture Has Repeated Failures
- Waymo's safety architecture relies on human backup that has repeatedly failed in real-world incidents.
- Examples include the school-bus passing errors, ambulance blocking in Austin, and stoppages in San Francisco during outages.
