The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

20VC: Why Google Will Win the AI Arms Race & OpenAI Will Not | NVIDIA vs AMD: Who Wins and Why | The Future of Inference vs Training | The Economics of Compute & Why To Win You Must Have Product, Data & Compute with Steeve Morin @ ZML

Feb 24, 2025
Steeve Morin, founder and CEO of ZML, discusses the competitive landscape of AI, emphasizing why Google is positioned to win the AI arms race while OpenAI might falter. He explores the evolving role of inference over training and the challenges facing AI hardware supply. The importance of data, product, and compute is highlighted as essential for success in AI. Morin also analyzes NVIDIA's market dominance amid competition from AMD and shares insights on semiconductor supply chain issues affecting chip production.
INSIGHT

GPUs Not Built for AI

  • GPUs, originally designed for graphics processing, were adapted for AI through GPGPU.
  • While effective, GPUs aren’t purpose-built for AI, which limits memory-transfer efficiency and overall performance.
INSIGHT

Chip Market Categories

  • The chip market can be divided into three categories: rentable/purchasable GPUs, rentable TPUs, and purchasable dedicated chips.
  • Because NVIDIA layers high margins on top of TSMC-fabricated chips, Google Cloud users face a "thin crust on a very big cake" scenario: much of the price reflects margin rather than silicon.
INSIGHT

Training vs. Inference

  • Training prioritizes abundant resources and fast iteration, while inference prioritizes reliability and minimal resource usage.
  • A key difference between training and inference is the need for interconnect, crucial for training but less so for inference.