How to cite this WIREs title: WIREs Comp Stat

Variable importance in regression models



Abstract

Regression analysis is one of the most widely used statistical methods. Often, part of the research question is the identification of the most important regressors, or an importance ranking of the regressors. Most regression models are not specifically suited to answering the variable importance question, so many different proposals have been made. This article reviews in detail the various variable importance metrics for the linear model, with particular emphasis on variance decomposition metrics. All linear model metrics are illustrated by an example analysis. For nonlinear parametric models, several principles from linear models have been adapted, and machine-learning methods have their own set of variable importance methods; these are also briefly covered. Although there are many variable importance metrics, there is still no convincing theoretical basis for them, and they all have a heuristic touch. Nevertheless, some metrics are considered useful for a crude assessment in the absence of a good subject-matter theory.

WIREs Comput Stat 2015, 7:137–152. doi: 10.1002/wics.1346

This article is categorized under: Statistical and Graphical Methods of Data Analysis > Multivariate Analysis
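To make the idea of a variance decomposition metric concrete, the sketch below illustrates one well-known member of that family: averaging each regressor's sequential R² contribution over all orderings of the regressors (often called the LMG metric). This is a minimal illustration on synthetic data, not code from the article; the function names and the simulated example are the author's own assumptions.

```python
# Hedged sketch of an averaged-over-orderings R^2 decomposition (LMG-style):
# each regressor's importance share is its mean sequential R^2 gain,
# averaged over all p! orderings of the regressors. Synthetic data only.
import itertools
import numpy as np

def r_squared(X, y, cols):
    """R^2 of an OLS fit of y on the columns `cols` of X (with intercept)."""
    if not cols:
        return 0.0
    Z = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def lmg_shares(X, y):
    """Average sequential R^2 gain of each regressor over all orderings."""
    p = X.shape[1]
    shares = np.zeros(p)
    perms = list(itertools.permutations(range(p)))
    for order in perms:
        entered = []
        for j in order:
            before = r_squared(X, y, entered)
            entered.append(j)
            shares[j] += r_squared(X, y, entered) - before
    return shares / len(perms)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
# true model: strong effect of X0, weaker effect of X1, X2 is pure noise
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=200)
shares = lmg_shares(X, y)
print(shares)  # shares sum to the full-model R^2 by construction
```

A useful property visible here is that the shares are nonnegative (up to sampling noise) and sum exactly to the full-model R², which is what makes this a genuine decomposition of explained variance; the factorial cost in the number of regressors is the price paid for that property.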
Figure caption: Visualization of the normalized metrics from Table . The metrics are ordered by increasing allocation to Examination. Blue stands for metrics that fulfill the Exclusion criterion, yellow/orange for the more marginally inclined metrics, green for metrics in between or untypified, and grey for the sequential allocations.

