
Learning Bayesian Statistics BITESIZE | How Do Diffusion Models Work?
Feb 19, 2026 Jonas Arruda, a researcher working on diffusion models and generative modeling, gives a clear mini-tutorial. He walks through starting from Gaussian noise and iteratively denoising to produce samples, and contrasts the forward noising process with the learned backward denoising. He also outlines training by adding controlled noise to clean parameters, and the role of noise schedules such as alpha and sigma.
Sampling By Iterative Denoising
- Diffusion models generate samples by iteratively denoising random noise into the target distribution.
- They start from an easy-to-sample distribution (e.g., Gaussian) and remove noise step-by-step until the desired posterior emerges.
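The iterative denoising loop described above can be sketched as a minimal DDPM-style ancestral sampler. This is an illustrative sketch, not the episode's code: the linear beta schedule, the Gaussian toy target, and every variable name here are assumptions. For a Gaussian target the optimal noise predictor has a closed form, so no trained network is needed to see the mechanism work.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
betas = np.linspace(1e-4, 0.05, T)      # assumed linear noise schedule
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)          # cumulative signal retention ("alpha")

mu0, sig0 = 3.0, 0.5                    # toy target: x0 ~ N(3, 0.5^2)

def predict_noise(x, t):
    """Exact E[eps | x_t] for the Gaussian toy target.

    Stands in for the trained denoiser a real diffusion model would learn.
    """
    a = np.sqrt(alpha_bar[t])           # signal scale at step t
    b = np.sqrt(1.0 - alpha_bar[t])     # noise scale at step t ("sigma")
    var_t = a**2 * sig0**2 + b**2       # marginal variance of x_t
    return b * (x - a * mu0) / var_t

def sample(n):
    x = rng.standard_normal(n)          # start from pure Gaussian noise
    for t in range(T - 1, -1, -1):      # remove noise step by step
        eps_hat = predict_noise(x, t)
        # DDPM posterior-mean update
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bar[t]) * eps_hat) / np.sqrt(alphas[t])
        if t > 0:
            # inject the reverse-step posterior variance (no noise at t = 0)
            var = betas[t] * (1.0 - alpha_bar[t - 1]) / (1.0 - alpha_bar[t])
            x += np.sqrt(var) * rng.standard_normal(n)
    return x

xs = sample(20000)
print(xs.mean(), xs.std())              # should land near the target (3.0, 0.5)
```

Starting from an easy-to-sample N(0, 1) and repeatedly applying the learned (here: exact) denoising step recovers samples close to the target distribution, which is exactly the mechanism the snip describes.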
Forward And Backward Processes
- Diffusion involves two linked processes: a forward noising process and a backward denoising process.
- The forward process maps the target distribution to noise, and the backward process learns to reverse that corruption to generate new samples.
Train By Adding Controlled Noise
- During training, add controlled noise to known clean parameters to learn the corruption path.
- Use this forward noising procedure so the model can learn the reverse denoising for inference.
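The training-time forward noising step above can be sketched in a few lines. Again an assumed DDPM-style setup, not the episode's code: the clean "parameters" are toy draws, and the schedule and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200
betas = np.linspace(1e-4, 0.05, T)      # assumed linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)

theta0 = rng.normal(2.0, 0.3, size=1000)  # known clean parameters (toy draws)

# Corrupt each clean parameter at a randomly chosen noise level:
t = rng.integers(0, T, size=theta0.shape)
eps = rng.standard_normal(theta0.shape)   # the noise the model must predict
theta_t = np.sqrt(alpha_bar[t]) * theta0 + np.sqrt(1.0 - alpha_bar[t]) * eps

# Training minimizes the denoising objective ||eps_hat - eps||^2.
# Here we only evaluate it for a trivial predictor that always outputs 0,
# whose expected loss is E[eps^2] = 1 (a real model would drive this lower).
loss = np.mean((0.0 - eps) ** 2)
print(loss)
```

The pairs `(theta_t, t, eps)` are exactly the supervision a denoising network needs: given the corrupted parameter and its noise level, predict the noise, so that at inference time the learned reverse process can undo the corruption.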
