AI Snips
Flip The Probability Question
- Maximum likelihood flips the usual question: instead of fixing parameters and asking how probable the data are, you fix the observed data and pick the parameter values that make those data most probable.
- In the bus example, if buses are numbered 1..N and you see bus #27, the likelihood of that sighting is 1/N for any N ≥ 27 and zero otherwise, so the MLE for the total number of buses is 27 — the value that maximizes the probability of observing #27.
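The bus example can be sketched numerically. This is a hypothetical illustration, assuming buses are numbered 1..N, a single sighting of bus #27, and a simple grid search over candidate fleet sizes:

```python
def likelihood(n_buses, observed=27):
    """P(seeing this bus number | fleet size n_buses) under uniform numbering."""
    if n_buses < observed:
        return 0.0          # impossible: the fleet is too small to contain bus #27
    return 1.0 / n_buses    # each of the n_buses numbers is equally likely to be seen

# Evaluate the likelihood over candidate fleet sizes and pick the maximizer.
candidates = range(1, 101)
mle = max(candidates, key=likelihood)
print(mle)  # 27: the smallest fleet size consistent with the observation
```

The likelihood 1/N is decreasing in N once N ≥ 27, so the maximizer is the smallest feasible fleet size.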
Sample Mean As An MLE
- For a normal model with known variance, maximizing the likelihood over the mean yields the sample mean.
- You can derive this by writing the likelihood as the product of the individual normal densities, taking the log, and setting the derivative with respect to the mean to zero.
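The calculus result can be checked numerically. A minimal sketch, assuming synthetic data drawn from a normal with known standard deviation 2.0: a grid search over the log-likelihood should land on the sample mean.

```python
import math
import random

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(500)]  # synthetic sample; sigma assumed known

def log_likelihood(mu, sigma=2.0):
    # Sum of log N(x | mu, sigma^2) over the whole sample
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

# Grid search over candidate means; the maximizer should match the sample mean.
grid = [i / 1000 for i in range(3000, 7001)]  # mu candidates in [3, 7], step 0.001
mle_mu = max(grid, key=log_likelihood)
sample_mean = sum(data) / len(data)
print(abs(mle_mu - sample_mean) < 1e-3)  # MLE agrees with the sample mean up to the grid step
```

With the variance known, the squared-error term is the only part that depends on mu, which is why the maximizer is exactly the sample mean.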
Log Likelihood Makes Optimization Practical
- For independent observations the likelihood is a product of per-observation terms; taking the natural log turns the product into a sum and avoids the numerical underflow that comes from multiplying many tiny numbers.
- Because the log is monotone increasing, maximizing the log-likelihood (or minimizing -2 log-likelihood) gives the same estimate as maximizing the likelihood itself, and it is algebraically and computationally easier.


