
The Quanta Podcast Audio Edition: How Can AI Researchers Save Energy? By Going Backward.
Jan 22, 2026 Discover how reversible programs could revolutionize energy use in AI by running backward. Learn about Michael Frank's transition from AI to efficiency research and the link between entropy and information. Delve into Landauer's principle revealing the heat costs of deletion and Bennett's concept of uncomputation. Explore challenges of creating reversible hardware and the renewed interest sparked by chip-scaling limits. Finally, see how parallel processing with reversible chips could lead to significant energy savings for AI applications.
AI Snips
Computation Linked To Thermodynamic Loss
- Reversible computers avoid the heat generated by information deletion because they never discard data.
- This rests on thermodynamics: erasing a bit pushes its physical state (for example, stored electron charge) irrecoverably into the environment, where it shows up as heat.
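Landauer's principle puts a number on that heat cost: erasing one bit must dissipate at least k_B · T · ln 2 of energy, where k_B is the Boltzmann constant and T is the temperature. A quick back-of-the-envelope calculation (not from the episode; the figures below are standard physical constants) shows the floor at room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # roughly room temperature, K

# Landauer limit: minimum energy dissipated per erased bit.
energy_per_bit = K_B * T * math.log(2)

print(f"{energy_per_bit:.3g} J per erased bit")  # ~2.87e-21 J
```

The number is tiny per bit, but a chip erasing bits trillions of times per second across billions of gates multiplies it into a real share of the energy budget, which is why avoiding erasure altogether is attractive.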
Research Shift Driven By Energy Concerns
- Michael Frank switched from AI to studying the physical limits of computation out of concern over computing's energy use.
- He identified reversible computing, which can run computations backward, as a candidate for reducing that energy waste.
Uncomputation Recovers Information Without Heat
- Bennett's uncomputation lets you run computations forward, store outputs, then run backward to erase intermediate data.
- This preserves information so no energy is lost to deletion, in principle avoiding heat dissipation.
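The pattern above can be sketched with a toy reversible gate. This is an illustrative example, not the episode's own code: it uses a Toffoli-style step (which is its own inverse) to compute an AND into a zeroed scratch bit, copies the answer out with XOR, then reruns the step to return the scratch bit to zero, so nothing is ever erased.

```python
# Toy Bennett-style uncomputation (illustrative sketch; names are assumptions).

def toffoli(a, b, c):
    # Reversible step: flips c only when a and b are both 1.
    # Applying it twice restores the original state, so it is its own inverse.
    return a, b, c ^ (a & b)

def compute_and(a, b):
    # 1. Forward: compute a AND b into a zeroed scratch bit.
    a, b, scratch = toffoli(a, b, 0)
    # 2. Copy the result into an output bit via XOR (also reversible).
    output = 0 ^ scratch
    # 3. Uncompute: rerun the step, returning scratch to 0 -- the
    #    intermediate data is recovered rather than deleted.
    a, b, scratch = toffoli(a, b, scratch)
    assert scratch == 0  # no information was discarded
    return output

print(compute_and(1, 1), compute_and(1, 0))  # 1 0
```

The key design point is step 3: instead of overwriting the scratch bit (an erasure, which Landauer's principle says must dissipate heat), the computation is reversed so the scratch bit ends exactly where it started.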
