
Beyond Bigger Models: Recursion As The Next Scaling Law In AI

Y Combinator Startup Podcast


Why Transformers Hit Reasoning Limits

Francois Chaubard contrasts RNNs and transformers, arguing that the parallel-training efficiency of LLMs comes at the cost of latent recurrence, memory compression, and some algorithmic reasoning ability.
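The contrast being drawn can be illustrated with a minimal NumPy sketch (my own illustration, not from the episode): an RNN compresses the entire history into one fixed-size hidden state updated step by step, while transformer self-attention processes all positions in parallel with no state carried between steps.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4          # hidden / model dimension
T = 6          # sequence length
xs = rng.standard_normal((T, d))

# RNN: history is compressed into one fixed-size hidden state,
# updated sequentially -- the "latent recurrence" and memory
# compression the summary refers to.
W_h = rng.standard_normal((d, d)) * 0.1
W_x = rng.standard_normal((d, d)) * 0.1
h = np.zeros(d)
for x in xs:
    h = np.tanh(h @ W_h + x @ W_x)    # O(1) memory in T, but sequential

# Transformer self-attention: every position attends to the whole
# sequence at once; no state is carried step to step, which is what
# makes training parallelizable.
W_q = rng.standard_normal((d, d)) * 0.1
W_k = rng.standard_normal((d, d)) * 0.1
W_v = rng.standard_normal((d, d)) * 0.1
Q, K, V = xs @ W_q, xs @ W_k, xs @ W_v
scores = Q @ K.T / np.sqrt(d)
attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
out = attn @ V                        # shape (T, d); O(T^2) compute
```

The RNN's `h` stays size-`d` no matter how long the sequence grows, whereas the attention output scales with `T`; that fixed-size recurrent bottleneck is the property the episode argues transformers traded away for training efficiency.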

