AI Snips
FFT Enables Efficient Long Convolutions
- The FFT lets SSMs compute very long convolutions in near-linear O(n log n) time instead of the O(n²) of direct convolution. That algorithmic property is why SSMs scale much better than quadratic attention on long inputs.
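A minimal sketch of the FFT convolution trick (not the paper's actual kernel): zero-pad both signals to the full output length, multiply their spectra, and invert. This turns an O(n²) time-domain convolution into O(n log n); the function name `fft_conv` is my own illustration, not from the episode.

```python
import numpy as np

def fft_conv(u, k):
    """Linear convolution of signal u with kernel k via the FFT.

    Padding to len(u) + len(k) - 1 avoids circular wrap-around,
    so the result matches direct convolution exactly.
    """
    n = len(u) + len(k) - 1
    U = np.fft.rfft(u, n)   # spectrum of the input signal
    K = np.fft.rfft(k, n)   # spectrum of the (SSM) kernel
    return np.fft.irfft(U * K, n)  # pointwise multiply, invert

# Sanity check against NumPy's direct O(n^2) convolution
u = np.random.randn(1024)
k = np.random.randn(1024)
assert np.allclose(fft_conv(u, k), np.convolve(u, k))
```

For a sequence of length 64k, the direct method needs on the order of 4 billion multiply-adds, while the FFT route needs roughly a million FFT operations, which is the gap the snip is pointing at.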
Audio Shows SSM Strength
- Raw audio waveforms are a clear example where transformers choke on sequence length (e.g., tens of thousands of samples per second, such as 64k). SSMs can process sequences that long and model waveform structure effectively.
Why 'Hungry Hungry Hippos'?
- The 'HiPPO' name traces back to 'hippocampus' and the lab's earlier work on memory mechanisms. 'Hungry Hungry Hippos' (H3) playfully signals the layer's dual-SSM design.


