
The Data Exchange with Ben Lorica
Adaptation: The Missing Layer Between Apps and Foundation Models
Mar 5, 2026. Sudip Roy, Co-founder and CTO of Adaption Labs, is known for building systems that adapt foundation models to real-world enterprise needs. He discusses why AI projects fail in the last 5% of reliability, explores gradient-free, inference-time techniques for routing and combining models, compares adaptation to fine-tuning, and outlines practical tradeoffs for cost, deployment, and continuous improvement.
Adaptation Is Broader Than Post-Training
- Post-training typically implies gradient updates to model weights; adaptation is broader, also covering gradient-free techniques applied at inference time.
- Roy argues gradient-free methods offer interactive, fast behavior changes without expensive retraining.
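One common gradient-free, inference-time technique is best-of-N selection: sample several candidate outputs and keep the one a scoring function prefers, without touching model weights. The sketch below is purely illustrative; the `generate` and `score` functions are hypothetical stand-ins, not anything Roy or Adaption Labs describes, and a real system would call a model API and a learned verifier instead.

```python
import random

# Hypothetical stand-in for a black-box model call. A real system would
# hit a completion API; here we simulate variable-quality outputs.
def generate(prompt: str, temperature: float, seed: int) -> str:
    rng = random.Random(seed)
    return prompt + " " + " ".join("token" for _ in range(rng.randint(1, 8)))

def score(candidate: str) -> float:
    # Placeholder reward: a real system might use a learned verifier,
    # unit tests, or rubric-based grading. Here, longer answers win.
    return float(len(candidate.split()))

def best_of_n(prompt: str, n: int = 8) -> str:
    # Gradient-free adaptation: behavior improves by sampling and
    # selecting at inference time; no weights are updated.
    candidates = [generate(prompt, temperature=0.9, seed=i) for i in range(n)]
    return max(candidates, key=score)

print(best_of_n("Summarize the report:"))
```

Because the loop only samples and ranks, the same wrapper works for any model you can call, which is what makes this class of technique fast and interactive compared with retraining.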
Abstract Prompt Complexity Behind An Algorithmic Layer
- Use a model-agnostic algorithmic layer to hide model-specific prompt complexities from users.
- That layer can route, combine, or merge models to achieve objectives without users hand-tuning prompts per model.
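A minimal sketch of such a layer, assuming a hypothetical registry of backends: each entry pairs a black-box model call with that model's preferred prompt template, so the caller states only the task and the layer handles per-model formatting and routing. All names here (`Backend`, `route`, the stub models) are illustrative, not an actual Adaption Labs API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Backend:
    name: str
    template: str               # model-specific prompt format, hidden from callers
    call: Callable[[str], str]  # black-box completion function
    cost_per_call: float

# Stub model calls standing in for real endpoints.
def cheap_model(prompt: str) -> str:
    return "[cheap] ok"

def strong_model(prompt: str) -> str:
    return "[strong] ok"

BACKENDS = [
    Backend("cheap", "Q: {task}\nA:", cheap_model, 0.001),
    Backend("strong", "<|user|>{task}<|assistant|>", strong_model, 0.02),
]

def route(task: str, hard: bool) -> str:
    # Routing policy: a real layer might use a learned difficulty
    # classifier; a caller-supplied flag stands in for one here.
    backend = BACKENDS[1] if hard else BACKENDS[0]
    prompt = backend.template.format(task=task)  # per-model formatting hidden here
    return backend.call(prompt)
```

The design point is that prompt templates live next to the backends, not in application code, so swapping or adding models changes the registry rather than every call site.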
Adaptation Works Even With Black Box Models
- You can implement many adaptation techniques without model weight access; weight access enables gradient-based strategies but isn't mandatory.
- Roy notes enterprises already use multiple models and gateways to route around black-box limitations.
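Combining opaque models needs nothing beyond their text APIs. As one illustration of a gradient-free combination a gateway could perform, the sketch below majority-votes answers from several black-box endpoints; the three stub models are hypothetical placeholders for real provider calls.

```python
from collections import Counter

# Hypothetical black-box endpoints: text in, text out, no weight access.
def model_a(prompt: str) -> str:
    return "42"

def model_b(prompt: str) -> str:
    return "42"

def model_c(prompt: str) -> str:
    return "41"

def ensemble(prompt: str) -> str:
    # Gradient-free combination: query several opaque models and keep
    # the majority answer, the kind of policy a gateway layer can apply.
    answers = [m(prompt) for m in (model_a, model_b, model_c)]
    return Counter(answers).most_common(1)[0][0]

print(ensemble("What is 6 * 7?"))
```

Weight access would additionally permit gradient-based strategies such as fine-tuning or merging, but nothing in this pattern requires it.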
