
The Weekly Show with Jon Stewart Silicon Valley Goes to War
Mar 11, 2026 Paul Scharre, Pentagon policy veteran now at CNAS, offers deep expertise on autonomous weapons. Dr. Sarah Shoker, UC Berkeley researcher and former OpenAI geopolitics lead, studies generative AI in warfare. They discuss how AI is woven into targeting and logistics. They probe tech firms’ ties to the military, opacity of contracts, governance options, escalation risks, and hardware as a policy lever.
AI Snips
Military Sees AI As A Productivity Tool
- Militaries treat AI as a productivity tool rather than a unique alien technology.
- Paul Scharre explains that armed forces use AI for back-end tasks, like logistics and analyst workflows, as well as for narrow battlefield tasks like imagery analysis.
What Counts As An Autonomous Weapon System
- Autonomous weapon systems are defined as weapons that can select and engage targets without human intervention.
- Sarah Shoker cites the U.S. DOD definition and notes that a human can be in the loop but is not required to be under that definition.
Claude Integrated Into Palantir's Maven For Targeting
- Anthropic's Claude was integrated into Palantir's Maven Smart System to make disparate sensor and satellite data more readable to analysts.
- Sarah Shoker notes that public reporting links Claude, via Maven, to target selection and prioritization in Iran.