
Elon Musk Podcast Musk's $25 Billion Custom AI Chip Factory
Mar 22, 2026 A deep dive into a $20–25B plan to build a 2 nm AI chip fab in Texas. Discussion covers the massive production scale and how chip buyers becoming chip makers reshapes the market, along with the rising inference demands of agentic AI and extreme power-density challenges. Covers HBM4 memory, liquid cooling, and purpose-built facilities, and explores space-grade chips, radiation hardening, and the risks of running a 2 nm fab.
TerraFab Aims To Verticalize Chip Supply At Scale
- Tesla and SpaceX plan a $20–25B TerraFab to produce 2 nm AI chips in massive volumes.
- Projected output is 100–200 billion custom chips annually to supply vehicles, robots, and space hardware.
Inference Is Becoming The Dominant Energy Sink
- Inference hardware demands are exploding as agentic AI requires continuous reasoning, vastly increasing power needs.
- Nvidia's new Rubin chips draw ~2,200 W per die, forcing purpose-built gigawatt-scale AI factories and specialized infrastructure.
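The power figures above imply a hard ceiling on accelerator count per site. A back-of-envelope sketch, assuming the episode's ~2,200 W/die figure and a hypothetical 1.3× facility overhead factor for cooling and power conversion (an assumption, not from the episode):

```python
# Sketch: how many ~2,200 W accelerator dies a gigawatt-scale AI
# factory can power. The 1.3x overhead factor (cooling, power
# conversion) is an illustrative assumption.

GIGAWATT = 1_000_000_000  # watts

def dies_supported(facility_watts, watts_per_die=2200, overhead=1.3):
    """Estimate the die count a facility can feed, after reserving
    a PUE-style overhead share for cooling and conversion losses."""
    usable = facility_watts / overhead
    return int(usable // watts_per_die)

print(dies_supported(GIGAWATT))  # → 349650 dies per gigawatt
```

Even with generous assumptions, a full gigawatt feeds only a few hundred thousand such dies, which is why each generation of higher-power chips pushes operators toward multiple dedicated gigawatt-scale sites.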
HBM4 Fixes The Memory Wall For Agentic Models
- Memory bandwidth is the critical bottleneck for high-performance AI and must match processor throughput.
- Industry is moving to HBM4 stacks (36–48 GB, ~2 TB/s per stack) to avoid the 'memory wall' that idles fast processors.
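The memory-wall point can be made concrete with a simple bandwidth-bound model: during autoregressive decoding, every model weight must be streamed from HBM once per generated token, so aggregate bandwidth caps throughput. The ~2 TB/s-per-stack figure is from the episode; the model size and stack count below are illustrative assumptions:

```python
# Sketch of the "memory wall": an upper bound on decode tokens/sec
# if all weights are read once per token (bandwidth-bound regime).
# Only the ~2 TB/s per HBM4 stack figure comes from the episode.

def tokens_per_second(model_bytes, stacks, bw_per_stack=2e12):
    """Bandwidth-bound upper limit on tokens generated per second."""
    aggregate_bw = stacks * bw_per_stack  # bytes/sec across all stacks
    return aggregate_bw / model_bytes

# Hypothetical 70B-parameter model at 2 bytes/param (~140 GB of weights)
model_bytes = 70e9 * 2
print(round(tokens_per_second(model_bytes, stacks=8)))  # → 114
```

Under these assumptions a processor with eight HBM4 stacks tops out near ~114 tokens/s per model copy regardless of how fast its compute units are, which is the idling effect the snip describes.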
