
Command and Control Promethean Shame
Apr 26, 2026
Elke Schwarz, an academic specializing in military AI ethics and author of Death Machines, warns about machines displacing human agency in targeting. She discusses automation and action biases, case studies of AI in combat, the idea of Promethean shame, and why speed can undercut sound strategic judgment. Short, urgent reflections on reclaiming human control over lethal technologies.
AI Snips
AI Scales Targeting Beyond Human Judgment
- Epic Fury showed how AI systems massively scale targeting, identifying thousands of targets (11,000 reported) and accelerating the kill chain.
- Dr Elke Schwarz warns that this scaling marginalizes human agency, creating quasi-autonomous workflows in which humans merely confirm machine outputs.
Distributed Systems Erode Legal And Moral Agency
- Human agency presumes knowledge, time, scrutiny, and responsibility, all of which are undermined when AI distributes decision-making across systems.
- Schwarz explains that law and accountability assume human fallibility, an assumption that breaks down when agency is diffused into opaque AI chains.
Morality Cannot Be Fully Encoded In AI
- You cannot code morality into a machine, because morality depends on human meaning, vulnerability, and mutual understanding.
- Schwarz argues that machines lack comprehension of pain, social bonds, and the political effects of harm, so ethical rules lose their meaning when automated.

