
How much does distillation really matter for Chinese LLMs?

Interconnects


Defining Modern Distillation

The host explains that, for modern LLMs, "distillation" now usually means training on synthetic data generated by stronger models, rather than classic knowledge distillation against a teacher's output distribution.
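The distinction can be sketched in code. Below is a minimal, illustrative contrast (not from the episode): classic knowledge distillation needs the teacher's full logits and minimizes a KL divergence, while the modern usage only needs the teacher's generated tokens and trains with ordinary cross-entropy on them. All function names and the toy logits are hypothetical.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    # Classic knowledge distillation: the student matches the teacher's
    # softened output distribution via KL(teacher || student).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q))))

def synthetic_data_loss(student_logits, teacher_token_id):
    # Modern "distillation": the teacher only contributes generated text;
    # the student does plain cross-entropy on that token (a hard label),
    # with no access to the teacher's logits at all.
    q = softmax(student_logits)
    return float(-np.log(q[teacher_token_id]))

# Toy example: one vocabulary position, three tokens.
teacher_logits = [2.0, 1.0, 0.1]
student_logits = [1.5, 0.8, 0.3]
teacher_choice = int(np.argmax(teacher_logits))  # token the teacher "emitted"

print(kd_loss(teacher_logits, student_logits))        # needs teacher logits
print(synthetic_data_loss(student_logits, teacher_choice))  # needs only text
```

The practical upshot is the second form: API access to a stronger model yields text but not logits, so "distilling" a frontier model in practice means supervised fine-tuning on its outputs.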

