"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis

Infinite Code Context: AI Coding at Enterprise Scale w/ Blitzy CEO Brian Elliott & CTO Sid Pardeshi

Feb 5, 2026
Sid Pardeshi, Blitzy CTO and systems engineer, and Brian Elliott, Blitzy CEO, whose company builds autonomous code generation, explain their “infinite code context” approach. They discuss dynamic agent architectures and model selection, large-scale code ingestion into schematized knowledge graphs, parallel testing and runtime-backed validation, pricing and paths to near-complete autonomy, and why the last 20% still needs humans.
AI Snips
ADVICE

Mix Model Families For Robustness

  • Use multiple model families (OpenAI, Google, Anthropic) and have dissimilar models review each other's work.
  • Assign roles: some models for first-pass generation, others for structured review and long-horizon planning.
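The cross-family review pattern above can be sketched in a few lines. This is a hypothetical illustration, not Blitzy's actual system: the `ModelClient` stub stands in for a real provider SDK call, and the role names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ModelClient:
    """Stand-in for one model family's API client (OpenAI, Google, Anthropic)."""
    family: str
    role: str  # e.g. "generate" for first-pass work, "review" for critique

    def run(self, prompt: str) -> str:
        # Stub: a real implementation would call the provider's API here.
        return f"[{self.family}:{self.role}] {prompt}"

def cross_family_review(task: str, generator: ModelClient, reviewer: ModelClient) -> dict:
    """First-pass generation by one family, structured review by a dissimilar one."""
    # Enforce the "dissimilar models" rule: the reviewer must come from
    # a different model family than the generator.
    assert generator.family != reviewer.family, "reviewer must be a dissimilar family"
    draft = generator.run(f"Implement: {task}")
    critique = reviewer.run(f"Review this draft for defects:\n{draft}")
    return {"draft": draft, "critique": critique}

result = cross_family_review(
    "parse a CSV header row",
    generator=ModelClient("openai", "generate"),
    reviewer=ModelClient("anthropic", "review"),
)
```

The key design choice is the assertion: pairing a generator with a reviewer from the same family would correlate their blind spots, which is exactly what mixing families is meant to avoid.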
INSIGHT

Memory Over Fine-Tuning

  • Fine-tuning is a last-mile optimization and often loses generality as new baseline models improve.
  • Blitzy is more bullish on memory systems (application-level, persistent preferences) than on fine-tuning models.
INSIGHT

Store Enterprise Memory In The System

  • Enterprise memory must live at the application/system layer because many preferences are locally specific.
  • Store decisions, traces, and preferences in the enterprise instance instead of compressing them into model weights.
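A minimal sketch of what "memory at the application layer" can look like, assuming a simple keyed record store; the class name, record kinds, and fields are illustrative, not Blitzy's schema.

```python
import time
from collections import defaultdict

class EnterpriseMemory:
    """Application-layer memory: decisions, traces, and preferences live in
    the enterprise instance rather than being compressed into model weights."""

    def __init__(self):
        # kind ("decision" | "trace" | "preference") -> list of entries
        self._records = defaultdict(list)

    def record(self, kind: str, content: dict) -> None:
        """Persist one timestamped entry in the system of record."""
        self._records[kind].append({"ts": time.time(), **content})

    def recall(self, kind: str) -> list:
        """Retrieve all entries of one kind, e.g. to inject into future prompts."""
        return list(self._records[kind])

mem = EnterpriseMemory()
mem.record("preference", {"team": "payments", "style": "no wildcard imports"})
mem.record("decision", {"choice": "Postgres over Mongo", "reason": "transactions"})
prefs = mem.recall("preference")
```

Because preferences are locally specific (per team, per repo), they stay queryable and editable here, and they survive swaps to newer baseline models, which is the advantage this snip claims over fine-tuning.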