
LessWrong (30+ Karma) “Ten different ways of thinking about Gradual Disempowerment” by David Scott Krueger (formerly: capybaralet)
Apr 5, 2026
A clear-eyed tour of how automation, corporate incentives, and state power can slowly strip away human decision-making. Topics include capitalist and evolutionary forces that favor AI, metrics and standardization concentrating control, competitive pressure to outsource to untrusted but capable systems, and worries about skill erosion and broader systemic failures.
Gradual Handoff As The Core Mechanism
- Gradual disempowerment is the incremental replacement of human labor and decision-making with AI, without any dramatic takeover.
- Krueger explains that companies and governments optimize for goals such as profit, GDP, and security, and will prefer AI wherever it better achieves them.
Institutions Optimize Goals Not Human Welfare
- Companies and governments pursue instrumental goals and will adopt AI whenever it serves those goals better than humans do.
- The paper's central argument: relentless, AI-powered optimization by institutions can destroy the things humans need to survive.
Capitalism Framing Amplifies Disposability Fears
- The capitalism framing connects gradual disempowerment to the fear that economic systems treat people as disposable.
- Krueger notes that both corporations and governments could discard citizens once they are no longer useful, citing historical atrocities as precedent.
