Odd Lots

Ray Wang on How AI Is Causing DRAM Prices to Surge

Feb 16, 2026
Ray Wang, analyst at SemiAnalysis and author of Memory Mania, breaks down the sudden DRAM and HBM rush sparked by AI. He explains why AI workloads slurp memory, how HBM differs from commodity DRAM, and why suppliers are struggling to scale. Short-term fixes, supplier incentives, and whether Chinese makers can close the gap are all explored in lively, technical detail.
INSIGHT

Inference Drives Large Memory Needs Too

  • Memory demand comes from both training and inference, with HBM critical for both workloads.
  • Long context windows and decoding-heavy inference are especially memory-bound and amplify demand.
INSIGHT

Long Contexts Multiply Memory Needs

  • AI models need large memory because longer context windows and heavy token processing keep more data in active memory.
  • Rising token consumption and longer responses multiply memory requirements per user.
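As an illustration of the scaling the episode describes (the numbers below are hypothetical, not from the episode): in transformer inference, every token in the context keeps a key and a value vector resident in memory per layer, so the KV cache grows linearly with context length. A minimal sketch, assuming a Llama-7B-style configuration (32 layers, 32 KV heads, head dimension 128, fp16):

```python
def kv_cache_bytes(context_len: int,
                   n_layers: int = 32,
                   n_kv_heads: int = 32,
                   head_dim: int = 128,
                   dtype_bytes: int = 2) -> int:
    """Estimate per-user KV-cache size for one sequence.

    The factor of 2 covers the separate key and value tensors
    cached for every layer and attention head.
    """
    return 2 * n_layers * n_kv_heads * head_dim * dtype_bytes * context_len

# Memory scales linearly with context length:
# 4k-token context  -> 2 GiB per user
# 128k-token context -> 64 GiB per user
for ctx in (4_096, 131_072):
    print(f"{ctx:>7} tokens: {kv_cache_bytes(ctx) / 2**30:.0f} GiB")
```

Under these assumed parameters, a 128k-token context needs 32x the memory of a 4k one for a single user, which is why longer contexts and longer responses multiply memory demand.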
INSIGHT

Consumer Devices Feel The Pinch First

  • Memory price spikes already cause visible demand destruction in PCs, smartphones, and camera components.
  • Companies may delay launches or accept margin hits rather than ship uncompetitive downgraded products.