
Big Brains: Could Data Centers Break Our Power Grid? (with Andrew Chien)
Mar 6, 2026

Andrew Chien is a University of Chicago computer scientist who studies large-scale and sustainable computing. He explains why data centers concentrate compute and how AI drives huge energy needs. He describes factory-like facilities, cooling and water challenges, grid stress risks, and ideas like using surplus renewables to lower carbon.
Why AI Models Are Exceptionally Energy Hungry
- Large AI models consume massive compute because they perform billions to trillions of arithmetic operations per token prediction.
- Chien notes modern large language models have hundreds of billions of weights, so each generated word can require ~100 billion to a trillion ops, drastically increasing energy needs.
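The ops-per-token figure above can be sanity-checked with a common rule of thumb (an assumption on my part, not stated in the episode): a transformer's forward pass costs roughly 2 arithmetic operations per weight per token (one multiply and one add). The model sizes below are illustrative, not from the episode.

```python
# Back-of-envelope: arithmetic operations per generated token.
# Assumption: ~2 ops (one multiply + one add) per model weight per token,
# a common rule of thumb for a dense transformer forward pass.

def ops_per_token(num_weights: int) -> int:
    """Approximate arithmetic operations for one token prediction."""
    return 2 * num_weights

# Illustrative model sizes (hypothetical, not from the episode)
for params in (70_000_000_000, 175_000_000_000, 500_000_000_000):
    print(f"{params / 1e9:.0f}B weights -> ~{ops_per_token(params):.1e} ops/token")
```

At a few hundred billion weights this lands in the ~100-billion-to-a-trillion-ops-per-word range Chien describes.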
AI Could Rapidly Increase National Electricity Demand
- AI-driven compute growth is outpacing carbon reduction progress and threatens to reverse gains made over the past decade.
- Chien warns computing's rapid power growth could push computing to 8–10% of U.S. electricity by 2030 and possibly 20–25% by 2035 if trends continue.
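As a rough consistency check on those projections, one can ask what sustained growth rate they imply. The baseline below (computing at ~4% of U.S. electricity in 2025) is my own assumed round figure, not a number from the episode; the point is only that steady compounding in the high teens reproduces both milestones.

```python
# Illustrative check: what annual growth in computing's share of U.S.
# electricity would reach ~8-10% by 2030 and ~20-25% by 2035?
# Assumed baseline (not from the episode): ~4% of U.S. electricity in 2025.
base_share, base_year = 0.04, 2025

def share(year: int, annual_growth: float) -> float:
    """Computing's share if it compounds at annual_growth from the baseline."""
    return base_share * (1 + annual_growth) ** (year - base_year)

# ~18%/yr compounding takes ~4% to ~9% by 2030 and ~21% by 2035
for yr in (2030, 2035):
    print(f"{yr}: {share(yr, 0.18):.1%}")
```

So the two projections are mutually consistent with roughly 18% annual growth sustained for a decade, which is why Chien frames this as a trend problem rather than a one-off spike.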
How Chien Realized Data Centers Would Be A Power Problem
- Chien recounts his transition from an Intel VP role to academia, during which he noticed the end of Dennard scaling and the slowing of Moore's Law.
- That realization in 2011 led him to focus on data center power, predicting a ‘big power problem’ and starting related research around 2015.
