
Bayesian regularization: From Tikhonov to horseshoe


Bayesian regularization is a central tool in modern-day statistical and machine learning methods. Many applications involve high-dimensional sparse signal recovery problems. The goal of our paper is to provide a review of the literature on penalty-based regularization approaches, from Tikhonov (Ridge, Lasso) to horseshoe regularization.

This article is categorized under:
Statistical and Graphical Methods of Data Analysis > Robust Methods
Statistical Models > Linear Models
Statistical Models > Bayesian Models
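As a concrete illustration of the penalty-based estimators the review covers, the sketch below (not code from the article; the synthetic design, sparsity level, and penalty value lam are all hypothetical) contrasts the closed-form Tikhonov/Ridge estimate with a coordinate-descent Lasso on a sparse signal recovery problem.

```python
# Minimal sketch (assumed example, not from the article): ridge (Tikhonov)
# vs. lasso on a synthetic sparse-signal recovery problem.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 10                       # hypothetical problem size
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]     # sparse signal: only 3 nonzero betas
y = X @ beta_true + rng.standard_normal(n)

lam = 1.0                            # hypothetical penalty weight

# Ridge / Tikhonov: closed form for min ||y - X b||^2 + lam * ||b||^2
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Lasso: coordinate descent with soft-thresholding for
# min 0.5 * ||y - X b||^2 + lam * ||b||_1
def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

beta_lasso = np.zeros(p)
for _ in range(200):                 # fixed number of sweeps for brevity
    for j in range(p):
        r_j = y - X @ beta_lasso + X[:, j] * beta_lasso[j]  # partial residual
        beta_lasso[j] = soft_threshold(X[:, j] @ r_j, lam) / (X[:, j] @ X[:, j])

print("true :", np.round(beta_true, 2))
print("ridge:", np.round(beta_ridge, 2))
print("lasso:", np.round(beta_lasso, 2))
```

Ridge shrinks all coefficients toward zero without setting any exactly to zero, while the lasso's soft-thresholding produces exact zeros; the horseshoe prior discussed in the article aims for strong shrinkage of noise coefficients with little shrinkage of large signals, and typically requires MCMC rather than a closed form.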
Figure: Comparison of the geometry of a unit ball induced by Normal, Laplace, Cauchy and Horseshoe priors
Figure: Posterior mode for each of the 10 betas estimated using Laplace, Normal and Horseshoe Bayesian models, as well as OLS estimates
Figure: Comparison of Laplace (LASSO), Normal (Ridge), Cauchy and Horseshoe priors
Figure: Condition number of the original problem (left) and the regularized one (right)
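The last figure's point, that Tikhonov regularization improves the conditioning of the normal equations, can be checked numerically; the near-collinear design and penalty value below are hypothetical, chosen only to make the effect visible.

```python
# Minimal sketch (assumed example, not from the article): condition number
# of X'X before and after adding the Tikhonov term lam * I.
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 5
X = rng.standard_normal((n, p))
X[:, 4] = X[:, 3] + 1e-4 * rng.standard_normal(n)   # nearly collinear columns

lam = 1.0                                           # hypothetical penalty
A = X.T @ X
print("cond(X'X)        :", np.linalg.cond(A))                      # huge
print("cond(X'X + lam*I):", np.linalg.cond(A + lam * np.eye(p)))    # far smaller
```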
