The Generalist

Everyone Is Betting on Bigger LLMs. She's Betting They're Fundamentally Wrong. (Eve Bodnia, Founder & CEO of Logical Intelligence)

Feb 24, 2026
Eve Bodnia, founder and CEO of Logical Intelligence and a former theoretical physicist, builds energy-based reasoning models such as Kona. She discusses why LLMs may only pattern-match, how energy-based models (EBMs) reason in latent space, a $4 vs. $15,000 cost comparison, surprising extrapolation at just 16M parameters, and why mission-critical systems need rule-based reasoning rather than probabilistic guessing.
ANECDOTE

Meeting Perelman Shaped Views On Ownership

  • Eve met Grigori Perelman as a teenager while interning in St. Petersburg and asked him why he declined the Fields Medal.
  • Perelman replied that many people were behind the result and he merely put pieces together, shaping Eve's view on ownership in science.
INSIGHT

Language Is A Lossy Map Of Thought

  • Language is a lossy mapping from rich internal mental representations to words, a view Eve frames in terms of the manifold hypothesis.
  • She explains that thoughts live in mixed latent formats (images, words, symbols), so translating them into language loses information and constrains reasoning.
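The lossy-mapping idea can be sketched as a toy vector-quantization round trip: a rich continuous "thought" vector is snapped to the nearest entry in a small discrete "vocabulary," and the listener can only recover that coarse prototype. Everything here (the dimensions, the random codebook) is an illustrative assumption for the sketch, not anything described in the episode:

```python
# Toy sketch (assumed setup, not from the episode): quantizing a rich
# continuous latent vector to a small discrete vocabulary loses information.
import numpy as np

rng = np.random.default_rng(0)

latent_dim = 64   # stand-in for a rich internal representation
vocab_size = 8    # stand-in for a coarse set of words/tokens

thought = rng.normal(size=latent_dim)

# A tiny "codebook": each word is one prototype vector in latent space.
codebook = rng.normal(size=(vocab_size, latent_dim))

# "Speaking": replace the thought with its nearest word prototype.
nearest = int(np.argmin(np.linalg.norm(codebook - thought, axis=1)))
utterance = codebook[nearest]

# "Hearing": the listener recovers only the prototype, not the thought.
loss = float(np.linalg.norm(thought - utterance))
print(f"chose word #{nearest}, reconstruction error = {loss:.2f}")
```

Because the codebook is far smaller than the space of possible thoughts, the reconstruction error is essentially always nonzero, which is the sense in which the mapping is lossy.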
INSIGHT

Symmetry Connects Physics Brain And AI

  • Symmetry and invariance unify physics, condensed matter, neuroscience, and AI because the same math describes order emerging from disorder.
  • Eve used symmetry groups to move knowledge between particle physics, materials, and brain models when seeking invariant patterns.