Open Source Self-Driving with Comma AI

Practical AI

On-Device Inference and Compute Constraints

Harald explains the split between data center training and on-device inference, then discusses thermal limits, small chips, and external GPU plans.

Segment begins at 24:28.
