
The Stack Overflow Podcast: AI attention span so good it shouldn’t be legal
Feb 6, 2026

Zuzanna Stamirowska, CEO of Pathway (building post-transformer, memory-first AI); Victor Szczerba, CCO at Pathway (enterprise product and observability lead); Rowan McNamee, Co-founder and COO of Mary Technology (legal fact-management for litigators). They discuss memory-inspired models and long attention spans, continual learning and efficient architecture, and using AI to extract, organize, and verify legal facts.
AI Snips
Sparse Positive Representations
- Pathway uses sparse, positive-only activations and represents synapses as sparse structures rather than dense matrices.
- They adapt mathematical tricks to run these efficiently on GPUs while preserving sparsity and the positive vector geometry (see the sketch after this list).
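
To make the idea concrete, here is a minimal sketch of sparse, positive-only activations paired with synapses stored as a sparse structure. Everything in it (the top-k sparsification, the 1% density, the shapes) is an illustrative assumption, not Pathway's actual implementation.

```python
# Sketch: sparse, positive-only activations and sparse synapses.
# Illustrative assumptions throughout -- not Pathway's implementation.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

def to_sparse_positive(x, k=16):
    """Keep only the top-k positive entries; everything else becomes exact zero.

    Returns a 1 x n CSR row so downstream ops can exploit the sparsity.
    """
    x = np.maximum(x, 0.0)                    # positive-only: clamp negatives to zero
    if np.count_nonzero(x) > k:
        cutoff = np.partition(x, -k)[-k]      # k-th largest value
        x = np.where(x >= cutoff, x, 0.0)
    return sparse.csr_matrix(x)

# Synapses as a sparse structure rather than a dense matrix:
# only ~1% of the 4096 x 4096 connections exist at all.
n = 4096
synapses = sparse.random(n, n, density=0.01, random_state=0, format="csr")
synapses.data = np.abs(synapses.data)         # keep the weights nonnegative too

activation = to_sparse_positive(rng.standard_normal(n))
next_activation = to_sparse_positive((activation @ synapses).toarray().ravel())

print(f"active units: {activation.nnz}/{n}, "
      f"next layer active: {next_activation.nnz}/{n}")
```

Because both the activation row and the synapse matrix are sparse, the matrix product only touches existing connections from active units, which is the efficiency the snip gestures at.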
Model State Equals Memory
- The model treats its synapses as state kept in memory, enabling fast-weight-like plasticity on-chip.
- This makes continual updates efficient and separates fast synaptic state from slower long-term memories (a sketch of the idea follows this list).
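
As a rough illustration of treating synapses as in-memory state, the sketch below uses a classic fast-weights update (a decaying Hebbian outer product) alongside fixed slow weights. The update rule, the decay and learning-rate values, and all names are assumptions chosen to illustrate the fast/slow separation, not Pathway's architecture.

```python
# Sketch: fast synaptic state updated every step, separate from slow weights.
# Illustrative assumptions throughout -- not Pathway's architecture.
import numpy as np

rng = np.random.default_rng(1)
d = 64

slow_W = rng.standard_normal((d, d)) * 0.05   # long-term weights: changed rarely, e.g. by training
fast_A = np.zeros((d, d))                     # fast synaptic state: lives in memory, updated every step

decay, lr = 0.95, 0.5                         # fast state forgets old inputs geometrically

def step(x):
    """One recurrent step: read through slow + fast weights, then
    imprint the current input into the fast state (Hebbian outer product)."""
    global fast_A
    h = np.tanh(slow_W @ x + fast_A @ x)          # fast state acts like extra, recent synapses
    fast_A = decay * fast_A + lr * np.outer(h, x) # cheap, continual in-memory update
    return h

for _ in range(10):
    h = step(rng.standard_normal(d))
print("fast-state norm after 10 steps:", np.linalg.norm(fast_A).round(3))
```

The point of the separation is that the fast state can absorb new information on every step without touching the slow weights, which is one way to get continual updates without full retraining.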
Longer Attention Lowers Hallucinations
- Built-in memory and time-aware reasoning reduce hallucinations by keeping models focused on a task for longer.
- Longer attention spans improve success on multi-hour, multi-step tasks compared with transformers constrained by fixed context windows.

