
Sid Sheth

Founder and CEO of d-Matrix, an AI inference chip company challenging GPU incumbents; discusses the company's Corsair chip and inference-focused strategy.

Top 5 podcasts with Sid Sheth

Ranked by the Snipd community
24 snips
Apr 30, 2025 • 55min

#251 Sid Sheth: How d-Matrix is Disrupting AI Inference in 2025

In a captivating discussion, Sid Sheth, CEO and Co-Founder of d-Matrix, dives into how his startup is transforming AI inference. He highlights the significance of inference over training for the future of AI and how d-Matrix’s Corsair PCIe accelerator outshines NVIDIA's offerings. Sid explains the role of in-memory compute technologies, the shift towards heterogeneous AI infrastructure, and the global landscape of inference chips. With insights from his extensive semiconductor background, he reveals his vision for creating a formidable competitor to industry giants.
21 snips
Nov 14, 2025 • 51min

Former Twitter CEO Building AI Web Infrastructure, Waymo’s Freeway Expansion | Nov 14, 2025

Parag Agrawal, former CEO of Twitter and now founder of Parallel Web Systems, shares insights on building AI web infrastructure and the future evolution of AI agents. Sid Sheth, CEO of d-Matrix, discusses their innovative inference chip challenging the GPU market, while Cory Weinberg reveals the boardroom turmoil at Grindr amid a takeover offer. The conversation also touches on Waymo's freeway expansion and its implications for autonomous vehicles, emphasizing the shift in public acceptance and safety perceptions.
9 snips
Feb 12, 2026 • 46min

Breaking the Memory Wall in the Age of Inference

Sid Sheth, founder and CEO of d-Matrix, builds memory-centric AI inference hardware optimized for low-latency reasoning. He discusses SRAM-first accelerator designs, why HBM favors training rather than inference, digital in-memory compute to cut data movement, and the trade-offs between latency and throughput. Practical deployment, software porting, and future multimodal and agentic inference trends are also covered.
8 snips
Aug 6, 2025 • 24min

Confronting AI’s Next Big Challenge: Inference Compute

In a dynamic conversation, Sid Sheth, Founder and CEO of d-Matrix, dives into the complexities of AI inference. He emphasizes that inference isn't a one-size-fits-all challenge and requires specialized hardware for different needs. Sid introduces d-Matrix's innovative modular platform, Corsair, designed to minimize memory-compute distance for faster performance. He also explores the parallels between human learning and AI deployment, and stresses the necessity for tailored infrastructure to enhance enterprise AI integration.
7 snips
Dec 8, 2025 • 47min

366: Inside the Age of Inference: Sid Sheth, CEO and Co-Founder of d-Matrix, on Smaller Models, AI Chips, and the Future of Compute

In this discussion, Sid Sheth, CEO and co-founder of d-Matrix, shares insights from his extensive semiconductor experience. He emphasizes that AI inference represents a transformative opportunity, driving productivity unlike any previous tech shift. Sid explains how smaller, more efficient models are key to AI's scalability and why a talent shortage could impede progress. He also discusses the need for purpose-built AI chips and highlights d-Matrix's approach to integrating into existing data centers.
