A review is given of shrinkage and penalization as tools to improve the predictive accuracy of regression models. The James‐Stein estimator is taken as the starting point. The procedures covered are Pre‐test Estimation, the Ridge Regression of Hoerl and Kennard, the Shrinkage Estimators of Copas and of Van Houwelingen and Le Cessie, the LASSO of Tibshirani and the Garotte of Breiman. An attempt is made to place all these procedures in a unifying framework of semi‐Bayesian methodology. Applications are briefly mentioned but not discussed in depth.
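As a brief orientation, the canonical forms of two of the procedures named above are recalled here in generic notation that is not part of the abstract itself ($X$ an observation or design matrix, $y$ a response, $\beta$ regression coefficients, $\lambda \ge 0$ a penalty weight, $p$ the dimension); this is an illustrative sketch of the standard definitions, not a statement of the paper's own formulation. The James‐Stein estimator shrinks the usual estimate of a multivariate normal mean towards the origin, while ridge regression and the LASSO penalize the residual sum of squares with quadratic and absolute‐value penalties, respectively:
\[
\hat{\theta}_{\mathrm{JS}} \;=\; \Bigl(1 - \frac{p-2}{\lVert X\rVert^{2}}\Bigr) X ,
\qquad X \sim N_{p}(\theta, I),\; p \ge 3,
\]
\[
\hat{\beta}_{\mathrm{ridge}} \;=\; \arg\min_{\beta}\; \lVert y - X\beta\rVert^{2} + \lambda \sum_{j} \beta_{j}^{2},
\qquad
\hat{\beta}_{\mathrm{lasso}} \;=\; \arg\min_{\beta}\; \lVert y - X\beta\rVert^{2} + \lambda \sum_{j} \lvert\beta_{j}\rvert .
\]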