
Beyond Bigger Models: Recursion As The Next Scaling Law In AI
Y Combinator Startup Podcast
00:00
Why Transformers Hit Reasoning Limits
Francois Chaubard contrasts RNNs and transformers, explaining how the parallel-training efficiency of transformer-based LLMs sacrifices latent recurrence, memory compression, and some algorithmic reasoning ability.
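The contrast can be made concrete with a toy numerical sketch (the shapes, names, and initialization here are illustrative, not from the episode): an RNN threads a fixed-size latent state through time, compressing everything seen so far, while causal self-attention computes every position in one parallel pass with no state carried between steps.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4   # hidden / model dimension (toy size)
T = 6   # sequence length
x = rng.standard_normal((T, d))  # toy input sequence

# --- RNN: a latent state is updated step by step (inherently sequential) ---
W_h = rng.standard_normal((d, d)) * 0.1
W_x = rng.standard_normal((d, d)) * 0.1
h = np.zeros(d)
for t in range(T):
    # each step compresses the entire history into the fixed-size state h
    h = np.tanh(W_h @ h + W_x @ x[t])

# --- Transformer-style causal self-attention: all positions in parallel ---
W_q = rng.standard_normal((d, d)) * 0.1
W_k = rng.standard_normal((d, d)) * 0.1
W_v = rng.standard_normal((d, d)) * 0.1
Q, K, V = x @ W_q, x @ W_k, x @ W_v
scores = Q @ K.T / np.sqrt(d)
# causal mask: position t may only attend to positions <= t
mask = np.triu(np.ones((T, T), dtype=bool), k=1)
scores[mask] = -np.inf
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ V  # one matrix product: no recurrent state between steps
```

The trade-off the episode points at is visible in the code: the RNN loop cannot be parallelized across `t` but keeps only a `d`-sized state, while the attention version trains in parallel at the cost of keeping all `T` positions around instead of a compressed latent memory.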