Learning Bayesian Statistics

#144 Why is Bayesian Deep Learning so Powerful, with Maurizio Filippone

Oct 30, 2025
Maurizio Filippone, an associate professor at KAUST and leader of the Bayesian Deep Learning Group, dives into the fascinating world of Bayesian function estimation. He explains why Gaussian Processes are still crucial for function estimation and how deep Gaussian Processes introduce flexibility for complex tasks. Maurizio discusses practical strategies like Monte Carlo Dropout for uncertainty quantification in neural networks, the trade-offs between model complexity and interpretability, and the role of Bayesian methods in modern generative models.
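The Monte Carlo Dropout strategy mentioned above can be sketched briefly: keep dropout active at prediction time and average many stochastic forward passes, so the spread of the outputs serves as an uncertainty estimate. The tiny one-layer network and random weights below are illustrative placeholders, not anything from the episode.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network with random (untrained) weights,
# purely to illustrate the mechanics of MC Dropout.
W1 = rng.normal(size=(1, 50))
W2 = rng.normal(size=(50, 1))

def forward(x, p_drop=0.5):
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p_drop   # dropout stays ON at test time
    h = h * mask / (1.0 - p_drop)         # inverted-dropout scaling
    return h @ W2

x = np.array([[0.3]])
# Many stochastic passes through the same network give a predictive
# distribution; its mean and std quantify the model's uncertainty.
samples = np.stack([forward(x) for _ in range(200)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
```

In a real framework the same effect is obtained by leaving the dropout layers in "training" mode during inference and looping over predictions.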
ANECDOTE

GPflow's Origin Story

  • Maurizio recalls GPflow's origins: James Hensman building a Gaussian Process library on top of TensorFlow.
  • He credits GPflow's clean API and reproducible examples with carrying many research projects a long way.
INSIGHT

Reparametrization Unlocks GP MCMC

  • MCMC for GPs struggles because hyperparameters are tightly coupled with the latent function, which slows mixing.
  • Reparameterizations (e.g., whitening via the Cholesky factor, or auxiliary variables) and pseudo-marginal methods speed up convergence of the hyperparameters.
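The whitening reparameterization in the insight above can be sketched as follows: instead of sampling the latent function f ~ N(0, K(θ)) directly, sample η ~ N(0, I) and set f = L(θ) η with K = L Lᵀ, so the MCMC chain moves in η-space where the prior is a fixed standard normal regardless of θ. This is a minimal illustrative sketch (the RBF kernel, jitter, and grid are my own choices, not from the episode):

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(x, lengthscale):
    # Squared-exponential kernel with a small jitter for numerical stability
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2) + 1e-6 * np.eye(len(x))

x = np.linspace(0.0, 1.0, 20)

# Centered parameterization: f ~ N(0, K(theta)) -- f and theta are coupled.
# Whitened (non-centered) parameterization: eta ~ N(0, I) is independent
# of theta, and f = L(theta) @ eta recovers the correct prior on f.
eta = rng.standard_normal(len(x))

def f_from_eta(eta, lengthscale):
    L = np.linalg.cholesky(rbf_kernel(x, lengthscale))
    return L @ eta

# The same whitened draw maps to different functions as the
# hyperparameter changes, which is what lets a sampler update
# theta without dragging the latent function along.
f_short = f_from_eta(eta, 0.1)
f_long = f_from_eta(eta, 1.0)
```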
ANECDOTE

Impressing O'Hagan With Live Code

  • Maurizio reran code at lunch to recreate a figure for Professor O'Hagan and impressed him with the speed.
  • He uses this story to highlight how efficient implementations made deep GP experiments practical even years ago.