Stepwise Regression

Algorithm

Stepwise regression is a method used to determine which predictor variables should be included in a regression model.

It is typically applied when it is unclear which of the candidate predictors actually contribute predictive power.

The method works by iteratively adding or removing variables according to a chosen criterion (such as a p-value threshold, AIC, or adjusted R²) and observing the effect on model fit.

For example, in a dataset with multiple predictors, forward stepwise regression might start with no variables in the model, add the most significant variable, and continue adding variables until no further significant improvement is observed, as in the sketch below.
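To make the forward-selection loop concrete, here is a minimal sketch in Python using ordinary least squares fits from statsmodels. The function name forward_stepwise, the 0.05 entry threshold, and the synthetic data are illustrative assumptions, not a canonical implementation.

# Minimal sketch of forward stepwise selection based on p-values.
# The 0.05 entry threshold and the toy data are illustrative choices.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X, y, enter_threshold=0.05):
    """Greedily add the predictor with the smallest p-value
    until no remaining candidate is significant."""
    selected = []
    remaining = list(X.columns)
    while remaining:
        # p-value of each candidate when added to the current model
        pvals = {}
        for col in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [col]])).fit()
            pvals[col] = model.pvalues[col]
        best = min(pvals, key=pvals.get)
        if pvals[best] < enter_threshold:
            selected.append(best)
            remaining.remove(best)
        else:
            break  # no remaining candidate meets the entry criterion
    return selected

# Toy example: y depends on x1 and x2 but not on the noise column.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 3)), columns=["x1", "x2", "noise"])
y = 2.0 * X["x1"] - 1.5 * X["x2"] + rng.normal(size=200)
print(forward_stepwise(X, y))  # typically selects ['x1', 'x2']

A backward variant would instead start with all predictors and repeatedly drop the least significant one; full stepwise methods combine both directions.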

Stepwise regression is known for producing unstable variable selections (small changes in the data can change which variables are chosen) and for overfitting the training data, which makes it a less reliable approach to dimensionality reduction.

However, it is important to understand this method as it lays the groundwork for more advanced techniques like least angle regression (LARS).

Alias
Stepwise Selection, Stepwise Model Selection
Related terms
Dimensionality Reduction, Least Angle Regression