Better Offline

Monologue: AI's Compute Demand Story Is A Lie

May 1, 2026
A sharp takedown of how a few companies are hoarding cloud GPU capacity and skewing AI economics. A breakdown of who is likely consuming most hyperscaler AI spend. A look at the VC-hyperscaler loop that props up overpriced infrastructure. A tour of billing quirks, inventory oddities, and what could trigger a broader reckoning.
INSIGHT

OpenAI Dominates Hyperscaler AI Revenue

  • AI revenue figures from hyperscalers hide where demand actually sits.
  • Ed Zitron estimates OpenAI spent ~$18B+ on Azure in 2025, making OpenAI roughly 70% of Microsoft's AI revenue and the consumer of most of its AI infrastructure.
INSIGHT

Amazon's AI Revenue Is Mostly Anthropic

  • Amazon's reported AI run rate is tiny relative to its CapEx spend.
  • Andy Jassy put the AWS AI run rate at ~$15B, but Ed argues much of that revenue likely comes from Anthropic, which he estimates drives ~75–80% of AWS AI revenue.
INSIGHT

Compute Shortages Driven By Monopolized Capacity

  • Capacity shortages are caused by a few tenants hogging compute, not by universal demand.
  • Epoch AI estimates Microsoft controls ~2.02 GW of compute while OpenAI has access to ~1.9 GW, implying OpenAI takes 80–90% of Microsoft's capacity.