A boosting technique generates successive models that are added to an overall ensemble model. At each generation step, a weighting mechanism puts greater focus on the training data that the ensemble has so far been unable to classify or predict correctly, which makes it more likely that the next model will learn how to process that data properly.
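The re-weighting mechanism can be illustrated with a short, self-contained sketch. The following Python code implements an AdaBoost-style weighting loop from scratch using decision stumps as weak learners; the synthetic dataset, the number of steps and the scikit-learn helpers are illustrative assumptions, not part of any particular boosting specification.

```python
# Minimal sketch of the re-weighting idea behind boosting (AdaBoost-style),
# with decision stumps as weak learners. Data and settings are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
y_signed = np.where(y == 1, 1, -1)        # AdaBoost conventionally uses labels in {-1, +1}

n_steps = 50
weights = np.full(len(X), 1.0 / len(X))   # start with uniform sample weights
models, alphas = [], []

for _ in range(n_steps):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y_signed, sample_weight=weights)
    pred = stump.predict(X)

    # Weighted error of this step's weak learner
    err = np.clip(np.sum(weights[pred != y_signed]) / np.sum(weights), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)  # weight of this model in the ensemble

    # Misclassified samples get a larger weight, so the next model focuses on them
    weights *= np.exp(-alpha * y_signed * pred)
    weights /= weights.sum()

    models.append(stump)
    alphas.append(alpha)

# Ensemble prediction: weighted vote of all weak learners
ensemble_pred = np.sign(sum(a * m.predict(X) for a, m in zip(alphas, models)))
print("training accuracy:", np.mean(ensemble_pred == y_signed))
```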
Important hyperparameters of boosting techniques are the weighting size (how strongly misclassified data is emphasised) and the number of boosting steps. The art of boosting is to find values for these hyperparameters that lead to the best result: set too low, the advantages of boosting are not fully realised; set too high, the ensemble model overfits the training data. The optimal values for a given use case can often be discovered by trial and error.
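As a rough sketch of such a trial-and-error search, the following Python snippet sweeps a small grid over the two hyperparameters, here using scikit-learn's GradientBoostingClassifier, where the weighting size corresponds to `learning_rate` and the number of steps to `n_estimators`. The library choice, the grid values and the synthetic dataset are all assumptions for illustration.

```python
# Trial-and-error tuning of the two key boosting hyperparameters via cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, random_state=0)

best_score, best_params = 0.0, None
for learning_rate in (0.01, 0.1, 0.5, 1.0):      # "weighting size"
    for n_estimators in (50, 100, 200):          # "number of steps"
        model = GradientBoostingClassifier(
            learning_rate=learning_rate, n_estimators=n_estimators, random_state=0
        )
        score = cross_val_score(model, X, y, cv=5).mean()
        print(f"lr={learning_rate}, steps={n_estimators}: CV accuracy {score:.3f}")
        if score > best_score:
            best_score, best_params = score, (learning_rate, n_estimators)

print("best combination:", best_params, "with CV accuracy", round(best_score, 3))
```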
Gradient boosting refers to a popular mathematical approach to performing the weighting, in which each new model is fitted to the gradient of the loss function (i.e. to the remaining errors of the ensemble so far), while AdaBoost is a popular ready-to-use implementation of the explicit re-weighting approach.
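Both variants are available as off-the-shelf implementations, for example in scikit-learn; the following minimal comparison on synthetic data is purely illustrative.

```python
# Quick side-by-side run of the two named boosting methods on the same data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [
    ("AdaBoost", AdaBoostClassifier(n_estimators=100, random_state=0)),
    ("Gradient boosting", GradientBoostingClassifier(n_estimators=100, random_state=0)),
]:
    model.fit(X_train, y_train)
    print(name, "test accuracy:", model.score(X_test, y_test))
```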
| Property | Value |
| --- | --- |
| alias | |
| subtype | Gradient boosting, AdaBoost |
| has functional building block | FBB_Classification, FBB_Value prediction |
| has learning style | LST_Supervised |
| has relevance | REL_Relevant |
| mathematically similar to | |
| typically supports | |