
The Next Wave - AI and The Future of Technology NotebookLM Just Killed After Effects? + ChatGPT vs Anthropic War
Mar 10, 2026

In this episode: NotebookLM's new cinematic features and whether they can replace After Effects; a rapid round-up of the latest LLM releases and what they mean for creators; the surprise surge of Claude after the Pentagon controversy and why users are switching; and Hollywood's growing use of AI in filmmaking and the production trade-offs it creates.
Why Many Model Releases Are Tweaks Not Retrains
- Model updates usually skip costly pre-training and instead use fine-tuning or RLHF to change behavior and tone.
- Matt Wolfe explains that GPT 5.3 reduced preamble by re-running reinforcement learning to make answers more direct, rather than retraining from scratch.
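The idea above, re-weighting behavior with a reward signal instead of retraining, can be sketched as a toy reinforcement loop. This is purely illustrative (not OpenAI's actual pipeline): a fixed "base model" samples between two response styles, and a reward that docks points for preamble length gradually shifts the sampling weights toward the direct style.

```python
import random

random.seed(0)

# Two response styles with equal answer quality but different preamble.
STYLES = {
    "verbose": {"preamble_tokens": 40, "answer_quality": 1.0},
    "direct":  {"preamble_tokens": 2,  "answer_quality": 1.0},
}

def reward(style):
    # Reward penalizes preamble length; answer quality is held constant.
    s = STYLES[style]
    return s["answer_quality"] - 0.01 * s["preamble_tokens"]

# "Policy": sampling weights over styles (a stand-in for model parameters
# that RLHF would nudge, leaving pre-trained knowledge untouched).
weights = {"verbose": 1.0, "direct": 1.0}

def sample(weights):
    styles, w = zip(*weights.items())
    return random.choices(styles, weights=w)[0]

LR = 0.1
for _ in range(500):                            # reinforcement loop
    style = sample(weights)
    weights[style] *= (1 + LR * reward(style))  # reward-weighted update

best = max(weights, key=weights.get)            # "direct" wins out
```

After the loop, the direct style dominates the sampling distribution: the model's tone changed, but nothing resembling pre-training happened.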
Distilled Models Deliver Similar Outputs Cheaper
- Distillation creates smaller, cheaper models by training on prompt-response pairs and chain-of-thought from a larger model.
- Matt Wolfe notes flash/light models are often distilled versions that mimic larger models' outputs at lower cost.
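The distillation recipe described above can be sketched with a toy teacher and student. This is illustrative only (not any lab's real setup): the "teacher" emits soft probabilities, and a smaller "student" model is fit on (input, teacher-output) pairs until it mimics the teacher's outputs.

```python
import math
import random

random.seed(1)

def teacher_prob(x):
    # Stand-in for a large model: probability that input x is "positive".
    return 1 / (1 + math.exp(-(3.0 * x - 1.0)))

# Student: a tiny two-parameter logistic model, sigmoid(w * x + b).
w, b, lr = 0.0, 0.0, 0.5
data = [random.uniform(-2, 2) for _ in range(200)]

for _ in range(300):
    for x in data:
        target = teacher_prob(x)                 # soft label from the teacher
        pred = 1 / (1 + math.exp(-(w * x + b)))
        grad = pred - target                     # d(cross-entropy)/d(logit)
        w -= lr * grad * x
        b -= lr * grad

# Worst-case gap between teacher and student on held-out inputs.
err = max(abs(teacher_prob(x) - 1 / (1 + math.exp(-(w * x + b))))
          for x in [-1.5, -0.5, 0.0, 0.5, 1.5])
```

The student ends up tracking the teacher closely despite being far smaller, which is the economic point of flash/light models: similar outputs at a fraction of the inference cost.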
Use Million Token Context For Big Coding Sessions
- Use models with large context windows for complex coding or long-horizon agent tasks to avoid lossy compaction of prior conversation.
- Matt Wolfe highlights GPT 5.4's 1-million-token context window as valuable for reading full codebases and preserving conversation state.
