Quantitude

S4E02 Underachievers, Overachievers, & Maximum Likelihood Estimation

Sep 20, 2022
Episode notes
INSIGHT

Flip The Probability Question

  • Maximum likelihood flips the usual question: instead of asking how probable the data are given fixed parameter values, it picks the parameter values that make the observed data most probable.
  • If city buses are numbered 1 to N and you happen to see bus #27, the MLE for N is 27: under a uniform model the likelihood of that observation is 1/N for any N ≥ 27 (and 0 otherwise), which is largest at N = 27.
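The bus example can be sketched numerically. This is a hypothetical illustration (the episode gives no code): buses are numbered 1..N, each equally likely to be seen, and we scan candidate fleet sizes for the one that maximizes the likelihood of observing bus #27.

```python
observed = 27  # the bus number we saw

def likelihood(n_buses: int, seen: int) -> float:
    """P(observe bus `seen` | fleet of size n_buses), uniform model:
    1/n_buses if the fleet is large enough to contain that bus, else 0."""
    return 1.0 / n_buses if n_buses >= seen else 0.0

# Scan candidate fleet sizes and keep the likelihood-maximizing one.
candidates = range(1, 101)
mle = max(candidates, key=lambda n: likelihood(n, observed))
print(mle)  # 27 -- the smallest fleet size consistent with the observation
```

The MLE sits at the boundary: any N below 27 makes the observation impossible, and 1/N shrinks as N grows beyond 27.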
INSIGHT

Sample Mean As An MLE

  • For a normal model with known variance, maximizing the likelihood over the mean yields the sample mean.
  • You can derive that by maximizing the product of individual likelihoods (or equivalently the log-likelihood) using calculus.
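The calculus step can be sketched as follows. For data \(x_1, \dots, x_n\) from a normal model with unknown mean \(\mu\) and known variance \(\sigma^2\), the log-likelihood is

```latex
\ell(\mu) = \log \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}}
            \exp\!\left(-\frac{(x_i-\mu)^2}{2\sigma^2}\right)
          = -\frac{n}{2}\log(2\pi\sigma^2)
            - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2 .
```

Setting the derivative with respect to \(\mu\) to zero,

```latex
\frac{d\ell}{d\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0
\quad\Longrightarrow\quad
\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x},
```

so the maximum likelihood estimate of the mean is the sample mean.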
INSIGHT

Log Likelihood Makes Optimization Practical

  • The joint likelihood of independent observations is a product of many small densities; taking the natural log turns the product into a sum and avoids numerical underflow.
  • Maximizing the log-likelihood (or minimizing −2 log-likelihood) yields the same estimates, because the log is monotone, and is computationally much easier.
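The underflow problem is easy to demonstrate. A minimal sketch with made-up data (1000 standard-normal draws, not from the episode): the raw product of densities collapses to 0.0 in floating point, while the sum of log-densities stays finite.

```python
import math
import random

random.seed(1)
# Hypothetical data: 1000 draws from a standard normal.
data = [random.gauss(0.0, 1.0) for _ in range(1000)]

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    """Normal density at x with mean mu and standard deviation sigma."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Multiplying 1000 densities (each well below 1) underflows to exactly 0.0.
product = 1.0
for x in data:
    product *= normal_pdf(x, 0.0, 1.0)
print(product)      # 0.0 -- the likelihood has underflowed

# Summing logs has the same maximizer but stays numerically stable.
loglik = sum(math.log(normal_pdf(x, 0.0, 1.0)) for x in data)
print(loglik)       # a finite negative number
print(-2 * loglik)  # the -2 log-likelihood (deviance) scale software reports
```

This is why estimation software works on the log scale and reports fit statistics in −2 log-likelihood units.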