
The Daily AI Show From Pokémon Go to Open Jarvis
Mar 16, 2026

They discuss Pokémon Go as unpaid spatial AI data collection and why map companies avoided gamification. The conversation shifts to NVIDIA GTC, the industry pivot from training to local inference, and hardware planning for agentic AI. Stanford's OpenJarvis and fully on-device personal agents get attention. They also explore Claude Code on phones, million-token contexts, Google's multimodal embeddings, and ideas for geolocation AR apps.
AI Snips
Inference Is The New GPU Growth Driver
- NVIDIA's future focus is shifting from massive training clusters to inference for real-time, agentic AI.
- Andy explains demand is growing for low-latency local inference on devices rather than only large training farms.
Chips Are Shrinking Toward Local Agentic AI
- Chipmakers shrink process nodes and integrate CPU+GPU cores to support on-device agentic computing.
- Andy and Brian discuss nanometer advances and Apple M-series as examples of stacking CPU and GPU for local AI workloads.
OpenJarvis Enables Fully On-Device Personal Agents
- Stanford's OpenJarvis pushes for local-first personal agents that run entirely on-device to reduce latency, cost, and data exposure.
- Andy summarizes OpenJarvis as research plus deployment-ready infrastructure for local execution with optional cloud use.
