How to cite this WIREs title:
WIREs Comp Stat

Shrinkage and absolute penalty estimation in linear regression models



Abstract  In predicting a response variable using a multiple linear regression model, several candidate models may be available that are subsets of the full model. Shrinkage estimators borrow information from the full model and provide a hybrid estimate of the regression parameters by shrinking the full-model estimates toward the candidate submodel. The process introduces bias into the estimation, but the reduction in overall prediction error offsets the bias. In this article, we give an overview of shrinkage estimators and their asymptotic properties. A real data example is given, and a Monte Carlo simulation study is carried out to evaluate the performance of shrinkage estimators against absolute penalty estimators such as the least absolute shrinkage and selection operator (LASSO), adaptive LASSO, and smoothly clipped absolute deviation (SCAD), based on a prediction error criterion in a multiple linear regression setup. WIREs Comput Stat 2012, 4:541–553. DOI: 10.1002/wics.1232

This article is categorized under:
Statistical Learning and Exploratory Methods of the Data Sciences > Modeling Methods
Statistical Models > Linear Models
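The shrinkage idea described above can be sketched in a few lines. The following is a minimal illustration, not the article's exact method: the data, dimensions, and the Wald-type statistic used as the shrinkage weight are standard Stein-type choices assumed for the example. The full-model estimate is pulled toward a candidate submodel, and the positive-part variant guards against over-shrinking.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulated data: p1 active predictors, p2 inactive ones
n, p1, p2 = 100, 3, 4
p = p1 + p2
X = rng.normal(size=(n, p))
beta_true = np.concatenate([np.array([2.0, -1.5, 1.0]), np.zeros(p2)])
y = X @ beta_true + rng.normal(size=n)

# Full-model (unrestricted) OLS estimate
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Candidate submodel: restricted OLS using only the first p1 predictors
beta_sub = np.zeros(p)
beta_sub[:p1], *_ = np.linalg.lstsq(X[:, :p1], y, rcond=None)

# Wald-type statistic testing that the last p2 coefficients are zero
resid_full = y - X @ beta_full
sigma2 = resid_full @ resid_full / (n - p)
XtX_inv = np.linalg.inv(X.T @ X)
b2 = beta_full[p1:]
T_n = b2 @ np.linalg.solve(XtX_inv[p1:, p1:], b2) / sigma2

# Stein-type shrinkage estimator (SE): shrink full estimate toward submodel
shrink = 1.0 - (p2 - 2) / T_n
beta_SE = beta_sub + shrink * (beta_full - beta_sub)

# Positive-part shrinkage estimator (PSE): never shrink past the submodel
beta_PSE = beta_sub + max(shrink, 0.0) * (beta_full - beta_sub)
```

A large T_n (strong evidence against the submodel) leaves the estimate close to the full-model fit, while a small T_n pulls it toward the submodel, trading a little bias for lower prediction error.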

Flowchart of shrinkage estimation in a multiple regression.


Graphical comparison of simulated RMSE for fixed p1 = 5, n = 110 when Δ* = 0.


RMSE of RE, SE and PSE estimators for n = 50 and (p1,p2) = (6, 5), (6, 10), (9, 5), (9, 10).


