Machine Learning Street Talk (MLST)

He Co-Invented the Transformer. Now: Continuous Thought Machines - Llion Jones and Luke Darlow [Sakana AI]

Nov 23, 2025
In this engaging discussion, Llion Jones, co-founder of Sakana AI and co-author of the Transformer architecture, shares insights on the need for innovation beyond Transformers in AI research. Joined by Luke Darlow, a specialist in biologically inspired models, they explore the limitations of current AI paradigms and introduce the Continuous Thought Machine (CTM). This novel model emphasizes internal reasoning and adaptive computation, aiming to enhance how AI processes information. Expect fascinating analogies and thought-provoking concepts that challenge the status quo!
ANECDOTE

Building CTM From Biological Hunches

  • Sakana built CTM from a simple biologically inspired idea about neuron synchronization and took time to polish experiments.
  • Llion Jones framed CTM as a poster child for risk-taking research that paid off at NeurIPS.
INSIGHT

CTM's Core Architectural Ideas

  • Continuous Thought Machines (CTM) add an internal sequential 'thought' dimension and synchronization-based representations.
  • CTM treats neurons as small models and measures pairwise synchronization to represent thoughts over time.
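The synchronization idea above can be sketched in a few lines. This is a toy illustration, not the paper's implementation: the array shapes, the inner-product definition of synchronization, and the use of the upper triangle as the representation are assumptions made for the sketch.

```python
import numpy as np

# Toy sketch of synchronization-based representations (assumed details,
# not Sakana's actual CTM code): each neuron produces an activation at
# every internal "thought" tick, and pairwise synchronization is taken
# as the inner product of two neurons' activation histories.

rng = np.random.default_rng(0)
num_neurons, num_ticks = 4, 16

# activations[i, t]: activation of neuron i at internal tick t
activations = rng.standard_normal((num_neurons, num_ticks))

# pairwise synchronization matrix: S[i, j] = sum_t a_i(t) * a_j(t)
sync = activations @ activations.T  # shape: (num_neurons, num_neurons)

# S is symmetric, so the upper triangle (diagonal included) suffices
# as a flattened "thought" vector for downstream readout heads
iu = np.triu_indices(num_neurons)
representation = sync[iu]
print(representation.shape)  # num_neurons * (num_neurons + 1) // 2 entries
```

Because the representation is built from activity *over time* rather than from a single hidden state, it changes as the model keeps "thinking", which is what gives CTM its internal sequential dimension.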
INSIGHT

Maze Task Reveals Sequential Thinking

  • Solving mazes by predicting full paths in one shot is easy for nets but unlike human sequential solving.
  • CTM enforces stepwise thinking, making the task harder but more human-like and revealing different algorithms.
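The contrast between one-shot and stepwise maze solving can be made concrete with a toy loop. Everything here is a stand-in (the move encoding, the tick cap, and the greedy policy replacing a learned model are all assumptions for illustration):

```python
# Toy contrast (not the paper's code): a one-shot solver emits the whole
# path in a single prediction, while a stepwise solver predicts one move
# per internal tick and feeds its new position back into the next tick.

MOVES = {"U": (-1, 0), "D": (1, 0), "L": (0, -1), "R": (0, 1)}

def stepwise_solve(start, goal, policy, max_ticks=32):
    """Predict one move per internal 'thought' tick until the goal."""
    pos, path = start, []
    for _ in range(max_ticks):  # cap on internal thinking steps
        if pos == goal:
            break
        move = policy(pos, goal)          # one decision per tick
        dr, dc = MOVES[move]
        pos = (pos[0] + dr, pos[1] + dc)  # feed position back in
        path.append(move)
    return "".join(path)

def greedy_policy(pos, goal):
    """Toy policy standing in for the learned model."""
    if pos[0] < goal[0]: return "D"
    if pos[0] > goal[0]: return "U"
    if pos[1] < goal[1]: return "R"
    return "L"

print(stepwise_solve((0, 0), (2, 2), greedy_policy))  # prints "DDRR"
```

A one-shot net would instead map the whole maze to the string "DDRR" in a single forward pass; forcing the loop above is what makes the task harder but exposes a more human-like, sequential algorithm.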