
Revenue Builders: How AI Is Rewriting the Sales Playbook and Raising the Bar on Human Performance with Alex Varel
Apr 30, 2026

Alex Varel, EVP of Worldwide Sales at Cerebras Systems, leads global go-to-market for AI infrastructure. He discusses why inference speed and memory bandwidth now drive competitive advantage, how legacy sales playbooks break down, and why sellers must master agentic AI, shorten their tech stacks, and constantly recalibrate ICPs to orchestrate value across complex buying groups.
AI Snips
Joining Cerebras After Seeing The Inference Pivot
- Alex Varel joined Cerebras after seeing AI shift from training to inference and met founders who built the world's largest chip to solve memory-bandwidth bottlenecks.
- He showed a wafer-sized chip and explained that co-locating memory and compute eliminates the memory wall, enabling much faster inference.
Inference Is The New Competitive Bottleneck
- Inference, not training, is now the strategic bottleneck because generating each token requires massive memory bandwidth and frequent data movement.
- Alex quantified that generating one token on a 70B model moves ~140GB of data, making latency and bandwidth the primary differentiators for AI products.
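The ~140GB figure follows from simple arithmetic, sketched below. The byte width and the bandwidth number are illustrative assumptions (16-bit weights; a ~3.35 TB/s HBM part), not figures from the episode:

```python
# Back-of-envelope check of the ~140GB-per-token claim for a 70B model.
# Assumption: every 2-byte (FP16/BF16) parameter is read once per token.
PARAMS = 70e9
BYTES_PER_PARAM = 2

bytes_per_token = PARAMS * BYTES_PER_PARAM
print(f"{bytes_per_token / 1e9:.0f} GB moved per token")  # 140 GB

# That traffic means memory bandwidth, not compute, caps the token rate.
# Example bandwidth (assumed): a ~3.35 TB/s HBM3 device.
hbm_bandwidth = 3.35e12  # bytes/s
print(f"~{hbm_bandwidth / bytes_per_token:.0f} tokens/s upper bound")
```

This is why the snip frames latency and bandwidth, rather than raw FLOPS, as the differentiators for AI products.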
Speed Is The Only Currency For AI UX
- User expectations demand AI responsiveness at human clock speed; slow inference has no market, just as slow search never did.
- Alex and Cerebras aim for sub-500ms (often ~200ms) responsiveness to enable richer agentic and conversational experiences.
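Per-token speed translates directly into whether a response lands inside that sub-500ms window. A minimal sketch, with illustrative token rates (the specific rates are assumptions, not from the episode):

```python
# Time to produce a 100-token reply at different per-token generation rates.
# Rates below are illustrative assumptions for comparison, not measured figures.
reply_tokens = 100
for label, tokens_per_sec in [("slower inference", 50), ("fast inference", 1000)]:
    latency_ms = reply_tokens / tokens_per_sec * 1000
    print(f"{label}: {latency_ms:.0f} ms")  # 2000 ms vs 100 ms
```

Only the faster case fits under a ~200ms responsiveness target, and the gap compounds when an agentic workflow chains several model calls in sequence.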