This is one in a series of videos produced by Professor Seth J. Chandler at the University of Houston Law Center for the course Analytic Methods for Lawyers.
This video covers three "ensemble methods" in supervised machine learning: bagging, boosting, and stacking. All three rest on the notion that, frequently, having more than one model make a prediction or classification works better than having a single model. Think of a jury instead of a judge. This video seeks to explain how each of these ensemble methods works. The focus is on understanding them at a conceptual level, not on developing state-of-the-art implementations. And a warning: boosting is challenging. Stacking isn't so easy either.
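To make the jury-versus-judge idea concrete, here is a minimal sketch using scikit-learn (the dataset from make_classification and all parameter settings are illustrative assumptions, not anything from the video) that compares a single decision tree against bagged, boosted, and stacked ensembles. It is a conceptual demonstration, not a state-of-the-art implementation.

```python
# A minimal sketch of the three ensemble ideas: many models "voting"
# often beat a single model -- the jury instead of the judge.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import (
    BaggingClassifier, AdaBoostClassifier, StackingClassifier
)

# A small synthetic classification problem (purely illustrative).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    # A single decision tree: the lone "judge".
    "single tree": DecisionTreeClassifier(random_state=0),
    # Bagging: many trees, each trained on a bootstrap resample of the data,
    # whose votes are aggregated.
    "bagging": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=50, random_state=0),
    # Boosting: models built sequentially, each focusing on the
    # mistakes of its predecessors (AdaBoost here).
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Stacking: a second-level model learns how best to combine the
    # predictions of heterogeneous base models.
    "stacking": StackingClassifier(
        estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                    ("logit", LogisticRegression(max_iter=1000))],
        final_estimator=LogisticRegression()),
}

for name, model in models.items():
    score = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name:12s} test accuracy: {score:.3f}")
```

On most runs the ensembles outscore the single tree, which is the conceptual point: aggregating several imperfect models tends to reduce error.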
#Bagging
#Boosting
#Stacking
#RandomForest
#DecisionTree
#RegressionTree
#LearningFromMistakes
#GradientBoosting
#AdaBoost