
Open Source Self-Driving with Comma AI
Practical AI
On-Device Inference and Compute Constraints (24:28)
Harald explains the split between data-center training and on-device inference, then discusses thermal limits, the constraints of small chips, and plans for external GPU support.