
283: Getting The Most Out of Data With Gradient Boosting
Super Data Science: ML & AI Podcast with Jon Krohn
How to Use Gradient Boosting to Improve Your Prediction Accuracy
Nicholas Hugg: Gradient boosting is an iterative procedure where you basically build one tree at a time. It's part of the bigger family of ensemble algorithms, like random forest. So instead of using a single tree, we use many trees, but each of them is quite restricted. And so we get something that is more stable and less prone to overfitting than if we just had one really big tree.
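The idea described above can be sketched in a few lines of plain Python: fit a very restricted tree (here, a depth-1 "stump") to the current residuals, add it to the ensemble with a small learning rate, and repeat. This is a toy illustration of the iterative procedure, not the speaker's or any library's implementation; the data and helper names are made up for the example.

```python
# Minimal gradient boosting sketch for squared loss, using depth-1
# regression stumps so each tree is "quite restricted". Pure stdlib.

def fit_stump(xs, residuals):
    """Find the single threshold split on x minimizing squared error."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, n_trees=50, lr=0.1):
    """Build one tree at a time, each fit to the current residuals."""
    base = sum(ys) / len(ys)  # start from the mean prediction
    trees = []
    for _ in range(n_trees):
        preds = [base + lr * sum(tr(x) for tr in trees) for x in xs]
        # For squared loss, the negative gradient is just the residual.
        residuals = [y - p for y, p in zip(ys, preds)]
        trees.append(fit_stump(xs, residuals))
    return lambda x: base + lr * sum(tr(x) for tr in trees)

# Toy 1-D regression: a step function that stumps can recover.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = boost(xs, ys)
```

Because each stump only ever moves predictions a little (via the learning rate), the ensemble converges gradually toward the targets instead of memorizing them in one big step, which is the stability the quote is pointing at.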


