
Why Social Media Lost in Court and AI Agents Demand Total Surveillance – Shelley Palmer's 5th Visit
The AI XR Podcast
Apple Secure Enclave as a Trust Play
Shelley suggests Apple's on-device Secure Enclave could win trust by keeping PII local while enabling on-device models.
Shelley Palmer, media technologist, advisor, and author with over 700,000 daily newsletter subscribers, returns to the show. He's one of the sharpest thinkers writing about AI today, and this conversation covers the full arc: from social media liability to the trust collapse coming for all of us, and into the real productivity gains and surveillance trade-offs of living inside an AI-first workflow.
The episode opens with the Google and Meta lawsuit verdict and quickly moves past the legal question. Shelley's position is precise: you can't legislate parenting, but you can legislate transparency, and the tech industry has failed on that front entirely. The $6 million judgment against Meta and Google is a rounding error — not a deterrent. What matters is what platforms actually engineered: engagement above all else, backed by neuroscience, probabilistic math, and dopamine feedback loops optimized for shareholders, not users.
AI XR News You Should Know
- OpenAI is ending Sora and pivoting hard to Codex and enterprise.
- Ben Affleck secured $900 million from Netflix for a custom AI filmmaking tool.
- Epic Games cut 1,000 jobs as Fortnite loses audience.
- NVIDIA's Jensen Huang introduced Nemo Claw and Open Shell at GTC, a corporatized framework for personal AI agents.
Key Moments
- [00:01:15] – Charlie opens noting the show missed one episode in nearly 300 — his daughter's wedding
- [00:01:55] – OpenAI kills Sora; the Critters director goes dark before the episode
- [00:04:45] – Google and Meta lose their social media addiction lawsuit; Meta also loses in New Mexico
- [00:08:07] – Shelley on what can actually be legislated: not parenting, but transparency
- [00:11:42] – Shelley on Zuckerberg: he genuinely believed connection would be net positive; ask him today
- [00:13:31] – "Planetarily net negative. No matter what good it does, it does more harm."
- [00:18:16] – Rony on dopamine engineering: neuroscientists studying pixel size, color, sound to refine addiction
- [00:19:40] – Shelley reframes it: engagement maximization for shareholders, no more insidious than that
- [00:23:19] – The physiological change argument: humans evolved to default to trust; AI-generated everything breaks that
- [00:31:50] – Rony's counterpoint: trust will reset local; the software ecosystem will follow
- [00:36:53] – Shelley: "Our business increased last year. Everyone on my staff is doing 400 times the work."
- [00:44:42] – AI-first means automating every workflow you can honestly automate — and knowing what isn't ready
- [00:45:06] – Jensen's Nemo Claw and Open Shell: the safer path to personal AI agents, and what it actually costs
- [00:49:42] – The surveillance trade-off: an effective AI agent requires more personal data exposure than anything before it
- [00:51:24] – Apple's Secure Enclave play: why Tim Cook may win the AI trust war in the end
The productivity gains are real, but so is the privacy exposure, and the systems that earn trust — at every level — are the ones that will survive.
This episode is brought to you by Zappar, the company behind Mattercraft — the leading visual development environment for building immersive 3D web experiences across mobile, headsets, and desktop. Mattercraft now features an AI assistant that helps you design, code, and debug in real time, right in your browser.
Start building at mattercraft.io. Subscribe to the AI XR Podcast wherever you listen.
Watch the full episode for the complete breakdown, available wherever you get your podcasts. Full video on YouTube: https://youtu.be/S_AECjELYyo
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.


