Power Law with John Coogan

Interview with Gavin Uberti & Robert Wachen (Etched)

Jun 25, 2024
AI experts Gavin Uberti and Robert Wachen discuss advances in AI models, the impact of custom chip design on the job market, and the scaling challenges of building data centers. They explore the future of GPU technology, government regulation, and the evolution of hardware and software in AI model training.
AI Snips
INSIGHT

Upsample Scarce Modalities To Teach Rare Skills

  • You can upsample scarce modalities (e.g., robotics) during training instead of needing enormous native datasets.
  • Repeating or oversampling valuable data points helps models learn rare but critical behaviors.
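The oversampling idea above can be sketched as a weighted-sampling loop. This is a hypothetical illustration, not code from the episode: the corpus, modality tags, and the 25x boost factor are all assumed values chosen to show how a scarce modality can dominate a training batch far beyond its natural frequency.

```python
import random

# Hypothetical training corpus: each item tagged with its modality.
# Robotics data is scarce (2%) relative to text (98%).
corpus = (
    [{"modality": "text", "sample": f"text_{i}"} for i in range(98)]
    + [{"modality": "robotics", "sample": f"robot_{i}"} for i in range(2)]
)

def upsample_weights(corpus, boost):
    """Assign each item a sampling weight.

    `boost` maps a modality name to a multiplier (assumed values,
    for illustration); unlisted modalities keep weight 1.0.
    """
    return [boost.get(item["modality"], 1.0) for item in corpus]

# Oversample robotics 25x so the model sees it often enough
# to learn rare but critical behaviors.
weights = upsample_weights(corpus, {"robotics": 25.0})
batch = random.choices(corpus, weights=weights, k=1000)

robotics_share = sum(x["modality"] == "robotics" for x in batch) / len(batch)
print(f"robotics share in batch: {robotics_share:.0%}")
```

With these weights, robotics items carry 50 of 148 total weight units, so they make up roughly a third of each sampled batch instead of 2%.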
INSIGHT

Open Source Spurs Ecosystem Efficiency

  • Open-source releases create economic value by driving optimization, ecosystem tools, and alternative deployment paths.
  • Releasing models spurs kernel-level improvements that lower long-term inference costs.
ADVICE

Go Low-Level For Hyperscale Training

  • Move below PyTorch when training huge models: implement custom transformer kernels and operator libraries for better utilization.
  • Expect to replace high-level frameworks with specialized kernels at hyperscale training.
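The kind of kernel-level rewrite this advice points at can be illustrated with a classic fusion example (a sketch in plain Python, not anything specific from the episode): a softmax computed as three framework-level ops versus a single fused pass that cuts memory traffic, using the well-known online softmax recurrence.

```python
import math

def softmax_unfused(xs):
    """Framework-style softmax: three separate passes (max, exp, sum),
    each conceptually a distinct kernel reading and writing memory."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def softmax_fused(xs):
    """Hand-fused softmax: one online pass tracking a running max and a
    rescaled sum, the sort of rewrite done below the framework level."""
    m, s = float("-inf"), 0.0
    for x in xs:
        if x > m:
            s *= math.exp(m - x)  # rescale the old sum to the new max
            m = x
        s += math.exp(x - m)
    return [math.exp(x - m) / s for x in xs]

xs = [1.0, 2.0, 3.0, 4.0]
assert all(abs(a - b) < 1e-9 for a, b in zip(softmax_unfused(xs), softmax_fused(xs)))
```

Real custom kernels fuse ops for the same reason: each framework-level operator round-trips through device memory, so collapsing passes raises hardware utilization.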