
Learning Bayesian Statistics #155 Probabilistic Programming for the Real World, with Andreas Munk
Apr 8, 2026
Andreas Munk is a researcher and entrepreneur in probabilistic programming who co-founded Evara and helped build PyProb. In this episode he discusses bridging deep learning with probabilistic programming, explains inference compilation and amortized inference, describes probabilistic surrogate networks for costly simulators, and demos embedding Bayesian workflows into Excel for practical decision-making.
AI Snips
Neural Nets Make Bayesian Inference Practical
- Deep learning supplies expressive proposal distributions while probabilistic programming keeps uncertainty explicit for decision-making.
- Andreas explains that amortized inference trains neural nets as proposal engines across many observations, so inference is fast whenever the same simulator is reused.
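The idea above can be sketched on a toy model. This is not PyProb code; it is a minimal illustration where a least-squares linear map stands in for the neural proposal network, assuming a Gaussian "simulator" (prior z ~ N(0, 1), observation x ~ N(z, σ²)) whose posterior mean really is linear in x. Regressing latent on observed across simulated traces is exactly the amortization step: learn a mapping from observation to proposal once, up front.

```python
import random

# Toy "simulator": prior z ~ N(0, 1), observation x ~ N(z, SIGMA^2).
# (Illustrative assumption, not the podcast's actual model.)
SIGMA = 0.5
rng = random.Random(0)

def simulate():
    z = rng.gauss(0.0, 1.0)   # latent variable
    x = rng.gauss(z, SIGMA)   # observed variable
    return z, x

# Amortization: fit a proposal for z given x across many simulated traces.
# A neural net would play this role in general; here a least-squares linear
# fit suffices because E[z | x] is linear for this Gaussian model.
traces = [simulate() for _ in range(50_000)]
n = len(traces)
sx = sum(x for _, x in traces)
sz = sum(z for z, _ in traces)
sxx = sum(x * x for _, x in traces)
sxz = sum(z * x for z, x in traces)
a = (n * sxz - sx * sz) / (n * sxx - sx * sx)  # slope of learned E[z | x]
b = (sz - a * sx) / n                          # intercept

# Analytic check: the true posterior mean is x / (1 + SIGMA^2), slope 0.8 here.
print(a, b)
```

At test time, inference for a new observation x is just `a * x + b`: one cheap evaluation instead of a fresh run of MCMC or importance sampling.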
Train Proposals On Simulator Traces
- Use inference compilation to amortize inference by training neural proposal networks that run alongside your probabilistic program.
- Train proposals on simulated traces so you can reuse them for repeated inference tasks instead of re-running expensive inference each time.
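The reuse step in the bullets above can be sketched as follows, again on an assumed toy model (z ~ N(0, 1), x ~ N(z, 0.5²)) rather than anything from the episode: a proposal q(z | x) = N(0.8·x, 0.6), as if learned during inference compilation, drives self-normalized importance sampling for each new observation, with no retraining between queries.

```python
import math
import random

SIGMA = 0.5
rng = random.Random(1)

def log_norm(v, mu, sd):
    # Log-density of N(mu, sd^2) at v.
    return -0.5 * ((v - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))

def posterior_mean(x_obs, n=20_000):
    """Estimate E[z | x_obs] by importance sampling with the trained proposal."""
    q_mu, q_sd = 0.8 * x_obs, 0.6  # proposal "learned" from simulator traces
    zs = [rng.gauss(q_mu, q_sd) for _ in range(n)]
    # Importance weights: prior * likelihood / proposal, in log space.
    logw = [log_norm(z, 0.0, 1.0) + log_norm(x_obs, z, SIGMA)
            - log_norm(z, q_mu, q_sd) for z in zs]
    m = max(logw)
    w = [math.exp(l - m) for l in logw]   # stabilized weights
    return sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)

# The same proposal answers repeated inference queries with no retraining;
# the analytic posterior mean for this model is x / (1 + SIGMA^2) = 0.8 * x.
for x_obs in (-1.0, 0.5, 2.0):
    print(x_obs, posterior_mean(x_obs))
```

Because the proposal stays close to the true posterior, the weights are well behaved and each query costs only a forward sweep of sampling and weighting, which is the payoff of amortizing over simulator traces.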
Curing Simulator Drove Amortized Inference Use
- Andreas used a composite-material curing simulator where internal temperatures are latent and only surface temperatures are observed, creating repeated inference problems.
- Because the simulator is fixed, training an amortized inference network let them reuse proposals and speed up inference dramatically.