
[38] Andrew Lampinen - A Computational Framework for Learning and Transforming Task Representations
Jan 8, 2022
Andrew Lampinen, a research scientist at DeepMind with a PhD from Stanford, discusses the intersection of cognitive flexibility and machine learning. He delves into metamapping, exploring how it improves task adaptability and zero-shot generalization. The conversation extends to the balance between human and artificial learning, emphasizing the importance of contextual understanding in how AI systems use and ground symbols. Lampinen also shares insights on transitioning from academia to industry and on balancing personal life with research commitments.
Curricula Are Key to Learning
- Human culture crafts curricula over millennia to develop complex skills stepwise.
- AI may need similarly structured curricula to acquire complex knowledge systematically.
Complementary Learning Systems Explained
- Complementary learning systems theory pairs slow, statistical cortical learning with fast episodic memory in the hippocampus.
- The two interact: the fast system supports rapid adaptation, while the slow system gradually integrates knowledge over a lifetime.
Metamapping Accelerates Task Adaptation
- Metamapping teaches models how to adapt to new tasks quickly by leveraging learned task relationships.
- Understanding higher-order task structures aids zero-shot task generalization across domains.
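The metamapping idea above can be illustrated with a minimal sketch. The assumption here is that tasks are represented as embedding vectors and that a meta-mapping is itself a learned transform applied to a task embedding, producing the representation of a related task without any training data for it. All names, dimensions, and the linear form of the mapping are illustrative choices, not the architecture from Lampinen's work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative embedding size; real systems learn these representations.
DIM = 8

def meta_map(task_embedding: np.ndarray, mapping: np.ndarray) -> np.ndarray:
    """Apply a meta-mapping (here, simply a linear transform) to a task
    embedding, yielding the embedding of the transformed task."""
    return mapping @ task_embedding

# Stand-ins for learned quantities: an embedding for a known task and a
# meta-mapping such as "switch the roles of the colors" in a game.
known_task = rng.normal(size=DIM)
switch_colors = rng.normal(size=(DIM, DIM))

# Zero-shot adaptation: the new task's representation is produced directly
# from the old one, with no examples of the new task.
new_task = meta_map(known_task, switch_colors)
print(new_task.shape)  # (8,)
```

The point of the sketch is the higher-order structure: the mapping operates on task representations rather than on inputs, which is what lets relationships learned between familiar tasks transfer to unseen ones.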

