DCD Zero Downtime: The Bi-Weekly Data Center Show

Bonus episode - Clouds and AI inference, with Cirrascale's CEO David Driggers

Aug 15, 2025
Join David Driggers, CEO and CTO of Cirrascale, a trailblazer in cloud services, as he discusses the future of cloud operations in 2025 and how Cirrascale navigates a competitive landscape dominated by hyperscalers. The conversation covers the growing AI inference market, highlighting the importance of GPU deployment and the unique challenges enterprises face in optimizing AI: balancing efficient resource management against budget constraints. David also reflects on the evolving role of AI in healthcare and programming, emphasizing the need for sustainable business models.
ADVICE

Right-Size GPUs For Cost Efficiency

  • Push models down the technology curve until they run "fast enough" to meet latency requirements.
  • Then choose the cheapest GPU that still meets that performance target.
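The right-sizing advice above can be sketched as a simple selection: among GPU tiers whose measured latency meets the service-level target, pick the cheapest. The GPU names, latencies, and prices below are illustrative assumptions, not benchmark data from the episode.

```python
from dataclasses import dataclass

@dataclass
class GpuOption:
    name: str
    p99_latency_ms: float  # measured latency for the target model (assumed)
    hourly_cost: float     # $/hour (assumed)

def cheapest_fast_enough(options, latency_slo_ms):
    """Return the lowest-cost GPU whose latency meets the SLO, or None."""
    viable = [g for g in options if g.p99_latency_ms <= latency_slo_ms]
    return min(viable, key=lambda g: g.hourly_cost, default=None)

# Illustrative catalog: older, cheaper tiers listed first.
catalog = [
    GpuOption("gpu-gen1", p99_latency_ms=180.0, hourly_cost=0.60),
    GpuOption("gpu-gen2", p99_latency_ms=95.0, hourly_cost=1.20),
    GpuOption("gpu-gen3", p99_latency_ms=40.0, hourly_cost=3.50),
]

pick = cheapest_fast_enough(catalog, latency_slo_ms=100.0)
print(pick.name)  # gpu-gen2: the cheapest tier under the 100 ms target
```

Pushing the model "down the curve" amounts to re-running the latency measurement on each cheaper tier until the SLO check fails, then taking the last tier that passed.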
ANECDOTE

Why Cirrascale Targets Always-On Workloads

  • Cirrascale targets production, always-on workloads where hyperscalers' elastic model is inefficient.
  • Hyperscalers suit dev/test but are costly for sustained training or production inference.
INSIGHT

Inference As A Service Model

  • True inference-as-a-service means the provider takes the model, deploys per SLOs, and charges by token usage.
  • Cirrascale claims to offer this turnkey deployment and token billing model.
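Token billing in that model can be sketched as metering input and output tokens and charging per million of each; the rates below are illustrative assumptions, not Cirrascale's actual pricing.

```python
def token_bill(input_tokens, output_tokens,
               input_rate_per_m=0.50, output_rate_per_m=1.50):
    """Dollar charge for a batch of requests, given assumed
    per-million-token rates for input and output tokens."""
    return (input_tokens / 1_000_000) * input_rate_per_m \
         + (output_tokens / 1_000_000) * output_rate_per_m

# A hypothetical month of usage: 2B input tokens, 500M output tokens.
monthly = token_bill(2_000_000_000, 500_000_000)
print(f"${monthly:,.2f}")  # $1,750.00
```

Output tokens are typically priced higher than input tokens because generation dominates GPU time; the SLO side of the service (latency, availability) is enforced separately from the metering.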