
Adventures in Machine Learning Hyperparameter Tuning for Machine Learning Models - ML 079
Jul 7, 2022
Tuning for Generalization
- Tuning hyperparameters like maximum depth and minimum samples per leaf lets you control model sensitivity to rare events.
- Analyze your data during exploratory data analysis (EDA) to understand feature interactions and potential split conditions.
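A minimal sketch of the depth and leaf-size controls mentioned above, using scikit-learn's DecisionTreeClassifier on synthetic data (the dataset and parameter values are illustrative assumptions, not from the episode):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for a dataset you have already explored via EDA
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A bounded depth plus a larger minimum leaf size keeps the tree from
# carving out splits for rare events in the training data
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
tree.fit(X_train, y_train)

print(tree.get_depth())            # never exceeds max_depth
print(tree.score(X_test, y_test))  # held-out accuracy
```

Raising `min_samples_leaf` forces each leaf to cover more observations, so isolated rare cases cannot dominate a split.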
Key Random Forest Hyperparameters
- When tuning Random Forests, prioritize the number of trees and max depth.
- These parameters greatly influence model stability and the bias-variance trade-off.
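One common way to tune the two parameters the snip prioritizes is a small cross-validated grid search; the grid values below are hypothetical, chosen only to illustrate the idea:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# Hypothetical search grid over the number of trees and the maximum depth
param_grid = {"n_estimators": [50, 100], "max_depth": [3, 6, None]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)

print(search.best_params_)  # best combination found by cross-validation
```

More trees stabilize the ensemble's predictions (at higher compute cost), while max depth shifts the bias-variance trade-off: shallow trees bias toward simpler patterns, deep trees risk variance from memorizing noise.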
Overfitting Example
- Setting max depth equal to the row count with min samples per leaf at one forces overfitting, creating a massive, impractical tree.
- Visualizing this overfit tree reveals how it touches almost every feature multiple times, hindering generalization.
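The overfitting setup described above can be reproduced directly; this sketch (synthetic noisy data, assumed sizes) sets the depth bound to the row count and the leaf size to one, letting the tree memorize the training set:

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# flip_y injects label noise, so memorizing it is genuinely harmful
X, y = make_classification(n_samples=200, n_features=10, flip_y=0.2,
                           random_state=0)

# Depth bound equal to the number of rows, one sample per leaf:
# the tree is free to isolate every training point
overfit = DecisionTreeClassifier(max_depth=len(X), min_samples_leaf=1,
                                 random_state=0)
overfit.fit(X, y)

print(overfit.score(X, y))      # near-perfect training accuracy
print(overfit.get_n_leaves())   # many leaves: one per memorized region
```

Plotting this tree (for example with `sklearn.tree.plot_tree`) shows the sprawl the snip describes: features reused across many levels, with leaves that cover a single noisy example.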
