The MAD Podcast with Matt Turck

Everything Gets Rebuilt: The New AI Agent Stack | Harrison Chase, LangChain

Mar 12, 2026
Harrison Chase, co-founder and CEO of LangChain, a leader in agent tooling and infrastructure, walks through why the AI stack is being rebuilt: harnesses that manage tools, subagents, and files; planning and context compaction; memory types; sandboxes for secure code execution; and observability for running stateful agents reliably. A short, technical, future-focused conversation on the new primitives reshaping autonomous AI.
INSIGHT

Four Types Of Agent Memory And How To Store Them

  • Memory splits into short-term (the current thread), semantic (RAG-like facts), episodic (past conversations), and procedural (instructions/skills).
  • In Deep Agents, procedural memory is stored as files the agent can update, letting it 'learn' by editing its own instructions.
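The file-based procedural memory described above can be sketched in a few lines. This is a hypothetical illustration, not LangChain's actual Deep Agents API; the file name and function names are invented for the example:

```python
# Sketch of procedural memory as an editable instructions file,
# in the spirit of Deep Agents (names are illustrative, not a real API).
from pathlib import Path

INSTRUCTIONS = Path("agent_instructions.md")

def read_instructions() -> str:
    """Tool: load the agent's procedural memory (its own instructions)."""
    return INSTRUCTIONS.read_text() if INSTRUCTIONS.exists() else ""

def update_instructions(new_rule: str) -> str:
    """Tool: the agent 'learns' by appending a rule to its instructions."""
    text = read_instructions()
    INSTRUCTIONS.write_text(text + f"\n- {new_rule}")
    return "instructions updated"

# Each turn, the file contents would be injected into the system prompt,
# so edits the agent makes to itself persist across conversations.
update_instructions("Prefer concise answers for status-check questions.")
print(read_instructions())
```

The key design point is that the instructions live outside the model, in mutable storage the agent can read and write through ordinary tools, so 'learning' is just a file edit.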
INSIGHT

Differentiation Comes From Instructions And Tools Not Agent Count

  • Whether to build one mega-agent or many subagents depends on use case; the enduring assets are precise instructions, tools, and skills.
  • Chase advises enterprises to focus on building instructions and tools because those are portable across architectures.
INSIGHT

Invest In Observability Sandboxes And Evals

  • Stable, investable infra includes observability, evals, sandboxes, and long-running stateful deployments rather than high-level harness formats.
  • LangChain focuses on low-level primitives so products remain useful as higher-level scaffolding evolves.