
Book Overflow: When Machines Can Code - Reflections on Trusting Trust by Ken Thompson + Coding Machines by Lawrence Kesteloot
Feb 16, 2026
The hosts compare Ken Thompson's compiler-backdoor concept with a fictional self-evolving worm that infects developer tooling, tracing how a malicious compiler could hide and persist across builds. They explore how machine learning and tooling could learn from human fixes and grow stealthier, and debate trust, ethics, AI intelligence, and practical guardrails for using code-generating tools.
Source Code Isn’t Sufficient For Trust
- Ken Thompson showed that a compiled binary can carry a persistent backdoor that no amount of source review will reveal.
- Trust ultimately rests on the people building the tools, not just on readable source code.
Harden Trust With Provenance And Guardrails
- Always question how systems could be tricked and design guardrails accordingly.
- Use cryptographic signing and rigorous provenance to raise the bar against supply-chain compromises.
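The provenance advice above can be illustrated with a minimal checksum check. This sketch is illustrative only: the file and workflow are hypothetical, and real supply-chain verification would layer signatures (e.g. over a release manifest) on top of the digest, not rely on the digest alone.

```python
import hashlib
import hmac
import tempfile

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large artifacts don't load fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_artifact(path: str, expected_hex: str) -> bool:
    """Compare against a digest published out-of-band (e.g. in a signed
    release manifest); constant-time compare avoids timing side channels."""
    return hmac.compare_digest(sha256_of(path), expected_hex)

# Demo with a throwaway file standing in for a downloaded release artifact.
with tempfile.NamedTemporaryFile(delete=False, suffix=".tar.gz") as f:
    f.write(b"release contents")
    artifact = f.name

pinned = sha256_of(artifact)            # in practice, published by the maintainer
ok = verify_artifact(artifact, pinned)  # True: artifact matches the pinned digest
bad = verify_artifact(artifact, "0" * 64)  # False: tampered or wrong artifact
```

Note the limit the episode's snips point at: this only proves the artifact matches what the builder published, not that the builder's own toolchain was clean.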
Compilers Can Perpetuate Invisible Backdoors
- A modified compiler can insert a backdoor into programs and then re-infect future compiler builds.
- The malicious behavior can vanish from source but persist in compiled binaries indefinitely.
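The re-infection loop described above can be sketched as a toy simulation. Everything here is invented for illustration: "compilation" is reduced to copying source text into a "binary" string, and the payload names are placeholders, not anything from Thompson's actual paper.

```python
# Toy model of Thompson's self-propagating compiler backdoor.
# Payload strings are hypothetical markers, not real exploit code.
LOGIN_BACKDOOR = "ACCEPT_MASTER_PASSWORD"
SELF_INFECTION = "REINSERT_TROJAN_WHEN_COMPILING_COMPILER"

def trojaned_compile(source: str) -> str:
    """A compromised compiler: emits the program plus hidden payloads."""
    binary = source
    if "def login(" in source:
        # Recognize the login program and splice in a backdoor.
        binary += "\n" + LOGIN_BACKDOOR
    if "def compile(" in source:
        # Recognize the compiler itself and re-insert the trojan, so even
        # a perfectly clean compiler source yields an infected binary.
        binary += "\n" + SELF_INFECTION
    return binary

# Both sources are clean: reviewing them reveals nothing.
clean_login_src = "def login(user, pw): ..."
clean_compiler_src = "def compile(src): return src"

login_binary = trojaned_compile(clean_login_src)
compiler_binary = trojaned_compile(clean_compiler_src)
```

After one build, `login_binary` contains the backdoor and `compiler_binary` contains the self-infection logic, while both source files stay clean; that gap between reviewable source and emitted binary is the whole point of the attack.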