
Eye On A.I. #326 Zuzanna Stamirowska: Inside Pathway's Post-Transformer Architecture Designed for Memory and On-the-Fly Learning
Mar 11, 2026
Zuzanna Stamirowska, co-founder and CEO of Pathway and a complexity scientist, discusses the brain-inspired BDH architecture for memory-enabled AI. She explains why current transformers reset between interactions and how sparse, graph-based memories with Hebbian-like updates enable real-time, continuously updating models. They cover scaling, resilience, interpretability, productization with NVIDIA and AWS, and use cases in changing-data and regulated industries.
Transformers Reset Every Interaction Without Memory
- Transformers lack persistent memory, resetting every interaction like a 'Groundhog Day' intern.
- Zuzanna argues persistent memory is essential for longer, coherent reasoning and handling open, context-dependent tasks with changing inputs.
Learning Happens Through Local Message Passing
- Pathway's BDH uses a graph of neurons with local message passing rather than dense all-to-all attention.
- Messages travel only to neighbors and synapses strengthen with use, creating efficient shortcuts and sparse activation (see the sketch below).
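The episode doesn't spell out BDH's actual update rules, so the following is a minimal, hypothetical Python sketch of the idea as described: neurons as nodes in a sparse graph, messages passed only to neighbors, and a Hebbian-style rule that strengthens edges whose endpoints co-activate. All names and constants are illustrative, not Pathway's implementation.

```python
import numpy as np

# Toy sketch (hypothetical, not Pathway's BDH code): neurons are nodes,
# synapses are weighted edges stored sparsely, and a Hebbian-style rule
# strengthens edges whose endpoints are active together.

rng = np.random.default_rng(0)
n_neurons = 8
# Sparse connectivity: each neuron only messages a few neighbors.
neighbors = {i: rng.choice(n_neurons, size=3, replace=False).tolist()
             for i in range(n_neurons)}
weights = {(i, j): 0.5 for i, js in neighbors.items() for j in js}

def step(activity, threshold=0.3, lr=0.05, decay=0.99):
    """One round of local message passing plus a Hebbian-like weight update."""
    messages = np.zeros(n_neurons)
    for i, js in neighbors.items():
        for j in js:
            messages[j] += weights[(i, j)] * activity[i]  # local, not all-to-all
    new_activity = (messages > threshold).astype(float)    # sparse activation
    for (i, j), w in weights.items():
        co_active = activity[i] * new_activity[j]           # "fires together"
        weights[(i, j)] = decay * w + lr * co_active        # "wires together"
    return new_activity

activity = (rng.random(n_neurons) > 0.7).astype(float)
for _ in range(5):
    activity = step(activity)
# The most-used edges end up strongest, acting as learned shortcuts.
print(sorted(weights.items(), key=lambda kv: -kv[1])[:3])
```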
Concepts Live On Fast Weights On The Edges
- BDH separates fast-weight state on the edges from slower-changing parameters, treating neurons as computation and synapses as memory (a sketch follows below).
- Zuzanna links this operator/state duality to quantum-physics-inspired intuition about where representations reside.
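Again as a rough illustration rather than Pathway's real code, the sketch below separates slow per-neuron parameters (the "computation", trained offline and frozen) from fast per-edge weights (the "memory", updated during inference and persistable between sessions). The class name, update rule, and save/load helpers are all assumptions made for this example.

```python
import numpy as np

class FastSlowLayer:
    """Hypothetical sketch (not Pathway's API): slow per-neuron parameters act
    as the computation, while fast per-edge weights act as memory that is
    updated at inference time and can be persisted between interactions."""

    def __init__(self, n_neurons, seed=0):
        rng = np.random.default_rng(seed)
        # Slow parameters: trained offline, held fixed at inference.
        self.gain = rng.normal(1.0, 0.1, size=n_neurons)
        # Fast weights: per-edge state, starts empty and fills with use.
        self.fast_w = np.zeros((n_neurons, n_neurons))

    def forward(self, x, lr=0.01, decay=0.995):
        pre = np.maximum(x, 0.0)                     # neuron computation
        out = self.gain * (pre + self.fast_w @ pre)  # read accumulated memory
        # Hebbian-style outer-product update of the edge state during inference.
        self.fast_w = decay * self.fast_w + lr * np.outer(out, pre)
        return out

    def save_memory(self, path):
        np.save(path, self.fast_w)   # the memory, not the model, is checkpointed

    def load_memory(self, path):
        self.fast_w = np.load(path)

rng = np.random.default_rng(1)
layer = FastSlowLayer(n_neurons=6)
for _ in range(3):
    out = layer.forward(rng.random(6))  # edge state evolves across calls
```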

