Machine Learning: How Did We Get Here?

The History of Machine Learning with Tom Mitchell

Transformers and Attention Breakthrough

Tom explains the 2017 'Attention Is All You Need' paper and the transformer architecture's pivotal role in modern large language models.

Chapter starts at 55:16.
