This document discusses methods for variable selection and regularization in linear regression. It begins by introducing the linear regression model and its advantages and limitations, then covers several approaches that can improve on ordinary least squares: subset selection methods, such as best subset selection and stepwise selection, and shrinkage methods, such as ridge regression. Ridge regression fits a model containing all predictors but shrinks their coefficient estimates toward zero, which can improve predictive performance. The document illustrates these methods on a credit scoring data set.
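To make the shrinkage idea concrete: ridge regression estimates coefficients by minimizing the residual sum of squares plus a penalty proportional to the sum of the squared coefficients, so larger penalties pull the estimates closer to zero. The following is a minimal sketch in Python using scikit-learn; the synthetic predictors and response merely stand in for the credit scoring data, and the choice of library, penalty grid, and variable names is an assumption rather than anything specified by the document.

```python
# Minimal sketch of ridge regression: all predictors are retained, but their
# coefficients are shrunk toward zero by an L2 penalty whose strength is
# chosen by cross-validation. The data below are synthetic stand-ins for the
# credit scoring example; the real data set is not reproduced here.
import numpy as np
from sklearn.linear_model import LinearRegression, RidgeCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))                    # hypothetical predictors (e.g. income, limit, rating)
beta = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta + rng.normal(scale=2.0, size=n)   # hypothetical response (e.g. credit balance)

# Standardize the predictors: the ridge penalty is not scale-invariant.
X_std = StandardScaler().fit_transform(X)

# Ordinary least squares fit, for comparison.
ols = LinearRegression().fit(X_std, y)

# RidgeCV picks the penalty parameter (scikit-learn's "alpha", the usual lambda)
# from a grid by cross-validation.
ridge = RidgeCV(alphas=np.logspace(-3, 3, 50)).fit(X_std, y)

print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Ridge coefficients:", np.round(ridge.coef_, 2))
print("Chosen penalty:    ", ridge.alpha_)
```

In a run like this, the ridge coefficients are smaller in magnitude than the least squares estimates, reflecting the bias-variance trade-off that shrinkage exploits to improve out-of-sample prediction.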