Machine Learning: How Did We Get Here?

A University and Corporate Perspective with Yann LeCun

Mar 2, 2026
Yann LeCun is an NYU professor and Turing Award winner known for convolutional nets and self-supervised learning. He traces neural-net history from early perceptrons and inspiration from vision neuroscience to commercial wins and the ImageNet revolution. He discusses PyTorch and autodiff, the rise of self-supervision and Transformers, and his world-model and JEPA ideas for learning predictive representations.
INSIGHT

Deep Learning Was A Rebranding Strategy

  • Yann and colleagues rebranded neural nets as "deep learning" to escape the field's stigma and broaden interest, organizing an unofficial "pirate" workshop alongside NIPS 2007, funded by CIFAR.
  • That workshop rebuilt the research community, so papers were once again reviewed by knowledgeable peers, kickstarting the revival.
INSIGHT

ImageNet Win Changed Everything

  • The 2012 ImageNet win by a convolutional net (Krizhevsky et al.) was the tipping point that made convolutional nets widely recognized and rapidly adopted across computer vision.
  • Within two years, CVPR flipped from routinely rejecting neural-net papers to expecting them.
ADVICE

Negotiate Research Terms Before Joining Industry

  • Yann accepted Meta's offer only after explaining three conditions to Mark Zuckerberg: open publication of research, remaining in New York, and keeping his NYU position.
  • He used these constraints to build FAIR as an open research organization inside industry.