Eye On A.I.

#326 Zuzanna Stamirowska: Inside Pathway's Post-Transformer Architecture Designed for Memory and On-the-Fly Learning

Mar 11, 2026
Zuzanna Stamirowska, complexity scientist and co-founder and CEO of Pathway, discusses the brain-inspired BDH architecture for memory-enabled AI. She explains why current transformers reset between interactions and how sparse graph-based memories with Hebbian-like updates enable real-time, continuously updating models. They also cover scaling, resilience, interpretability, productization with NVIDIA and AWS, and use cases for changing-data and regulated industries.
INSIGHT

Transformers Reset Every Interaction Without Memory

  • Transformers lack persistent memory, resetting like a 'Groundhog Day' intern at every interaction.
  • Zuzanna argues persistent memory is essential for longer, coherent reasoning and handling open, context-dependent tasks with changing inputs.
INSIGHT

Learning Happens Through Local Message Passing

  • Pathway's BDH uses a graph of neurons with local message passing rather than dense all-to-all attention.
  • Messages travel to neighbors and synapses strengthen with use, creating efficient shortcuts and sparse activation.
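The mechanism described in this snip can be sketched in a few lines: messages travel only along existing edges of a sparse graph, activation is kept sparse, and edges used by co-active neurons strengthen. This is a minimal illustration of the idea, not Pathway's actual BDH implementation; the graph layout, the top-k sparsification, and the Hebbian-style update rule are all assumptions made for the sketch.

```python
# Minimal sketch of local message passing with Hebbian-style edge updates.
# Illustration only -- NOT Pathway's BDH code; names and rules are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 8
# Sparse graph: each neuron talks only to a few neighbours,
# not all-to-all as in dense attention.
neighbours = {i: rng.choice(n_neurons, size=2, replace=False).tolist()
              for i in range(n_neurons)}
# Synaptic weights live on the edges and start small.
weights = {(i, j): 0.1 for i, js in neighbours.items() for j in js}

def step(activity, k=3, eta=0.05):
    """One round of local message passing with a Hebbian-like update."""
    new_activity = np.zeros_like(activity)
    for i, js in neighbours.items():
        for j in js:
            # A message travels only along an existing edge.
            new_activity[j] += weights[(i, j)] * activity[i]
    # Sparse activation: keep only the k strongest responses.
    mask = np.zeros_like(new_activity)
    top = np.argsort(new_activity)[-k:]
    mask[top] = new_activity[top]
    # Hebbian-like rule: edges between co-active neurons strengthen with use,
    # carving efficient shortcuts into the graph.
    for (i, j), w in weights.items():
        weights[(i, j)] = w + eta * activity[i] * mask[j]
    return mask

activity = rng.random(n_neurons)
for _ in range(5):
    activity = step(activity)
```

The repeated strengthening of frequently used edges is what creates the "efficient shortcuts" mentioned above: paths that carry correlated activity accumulate weight and dominate later message passing.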
INSIGHT

Concepts Live On Fast Weights On The Edges

  • BDH separates fast-weight state on edges from slower parameters, treating neurons as computations and synapses as memory.
  • Zuzanna links this operator/state duality to quantum-physics-inspired intuition about where representations reside.
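The "neurons as computation, synapses as memory" split in this snip can be sketched as a layer with two weight matrices: slow parameters fixed at inference time, and fast-weight state on the edges that the model rewrites on the fly. This is a hypothetical illustration under assumed names and update rules, not Pathway's BDH code.

```python
# Sketch of the fast-weight / slow-parameter split described above.
# Illustration only -- the decay/outer-product rule is an assumption.
import numpy as np

class FastWeightLayer:
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Slow parameters: learned offline, frozen during inference.
        self.W_slow = rng.standard_normal((dim, dim)) / np.sqrt(dim)
        # Fast weights: per-edge state, rewritten at every step --
        # this is where the layer's memory lives.
        self.W_fast = np.zeros((dim, dim))

    def forward(self, x, decay=0.95, eta=0.1):
        # Neurons compute ("neurons as computation") ...
        y = np.tanh((self.W_slow + self.W_fast) @ x)
        # ... while the edge state is updated Hebbian-style from the
        # outer product of post- and pre-synaptic activity
        # ("synapses as memory").
        self.W_fast = decay * self.W_fast + eta * np.outer(y, x)
        return y

layer = FastWeightLayer(dim=4)
out1 = layer.forward(np.ones(4))
out2 = layer.forward(np.ones(4))  # same input, but the state has changed
```

Because `W_fast` persists across calls, the same input can produce a different output on the second pass: the state carried on the edges, not the slow parameters, is what lets the model keep learning between interactions.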