
Odd Lots Meet the Politician the AI Industry Is Trying to Stop
Dec 18, 2025

Alex Bores, a New York assembly member and tech-savvy congressional candidate, discusses the explosive political landscape of AI regulation. He reveals how a $100 million super PAC is targeting him for advocating state-level regulations like the RAISE Act, which aims to ensure safety in frontier AI. Bores also addresses the risks of AI, such as its impact on children and education, and compares its potential to dual-use technologies. His experience at Palantir adds depth to the conversation on implementing effective tech regulation in government.
Episode notes
Use Concrete Thresholds To Target Risk
- The RAISE Act applies only to labs that spend at least $100 million on final training runs or significant knowledge distillation.
- Use concrete thresholds to target a few frontier players rather than broad, economy-wide rules.
Distillation As A Geopolitical Vector
- Bores highlights knowledge distillation as the route China uses to catch up without massive compute.
- Regulating distillation closes geopolitical loopholes that compute limits alone miss.
Test For Harmful Behavior, Not Intent
- Test models by observing their impacts and behaviors in compromised scenarios rather than trying to infer intent.
- Remove dangerous behaviors (such as extortion-like outputs) before release using targeted behavioral tests.