
WinBUGS: a tutorial

The reinvention of Markov chain Monte Carlo (MCMC) methods and their implementation within the Bayesian framework in the early 1990s established the Bayesian approach as one of the standard methods in the applied quantitative sciences. Their extensive use in complex real-life problems has led to increased demand for friendly and easily accessible software that implements Bayesian models by exploiting the possibilities provided by MCMC algorithms. WinBUGS is software that covers this need. It is the Windows version of the BUGS (Bayesian inference Using Gibbs Sampling) package, which appeared in the mid-1990s. It is a free and relatively easy-to-use tool that estimates the posterior distribution of any parameter of interest in complicated Bayesian models. In this article, we present an overview of the basic features of WinBUGS, including information on model and prior specification, the code and its compilation, and the analysis and interpretation of the MCMC output. Some simple examples and the Bayesian implementation of the Lasso are illustrated in detail. WIREs Comp Stat 2011, 3: 385–396. doi: 10.1002/wics.176
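To give a flavor of the kind of model specification the tutorial walks through, the following is a minimal sketch of a Bayesian Lasso regression written in the BUGS language. It is not the authors' exact code: the data names (y, x, n, p), the vague priors on the intercept and error precision, the way the shrinkage parameter lambda enters the double-exponential prior, and the fixed value 0.067 (one of the values appearing in the figures) are all assumptions made for illustration.

    model {
      for (i in 1:n) {
        y[i] ~ dnorm(mu[i], tau)              # normal likelihood, tau is the precision
        mu[i] <- alpha + inprod(b[], x[i, ])  # linear predictor
      }
      alpha ~ dnorm(0, 1.0E-4)                # vague prior for the intercept
      for (j in 1:p) {
        b[j] ~ ddexp(0, lambda)               # double-exponential (Laplace) prior: Lasso-type shrinkage
      }
      tau ~ dgamma(0.01, 0.01)                # vague prior for the error precision
      sigma2 <- 1 / tau                       # error variance
      lambda <- 0.067                         # shrinkage parameter fixed here (assumed value)
    }

In WinBUGS such a model would typically be checked, loaded with data and initial values, and compiled from the Specification Tool in the Model menu, with posterior samples for b monitored through the Samples tool in the Inference menu.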

Figure 1. Posterior summaries of the Lasso regression parameters using standardized data (λ = 0.067).

Figure 2. Posterior summaries of the Lasso regression parameters for the unstandardized data (λ = 0.067).

Figure 3. Posterior densities of the Lasso regression coefficients for the unstandardized data (λ = 0.067).

Figure 4. Posterior summaries of the Lasso regression parameters for standardized and unstandardized data (λ = 2).

Figure 5. Posterior box plots describing the 95% credible intervals of the regression coefficients using the standardized data (obtained by inserting node b in the node box of the Compare tool in the Inference menu). (a) λ = 0.067; (b) λ = 2.

Figure 6. Posterior summaries of the Lasso regression coefficients with variable selection.

Figure 7. Posterior summaries of the indicator parameters included in the Bayesian Lasso model.

Figure 8. Posterior densities for the model parameters β(γ).

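Figures 6–8 refer to a Bayesian Lasso variant with variable selection through indicator parameters, with β(γ) denoting the effective coefficients. The article's exact formulation is not reproduced here; the sketch below shows one common construction in the BUGS language, in which each coefficient is multiplied by a Bernoulli inclusion indicator. The node names (gam, beta, bg), the Bernoulli(0.5) inclusion prior, and the fixed lambda = 2 are assumptions for illustration only.

    model {
      for (i in 1:n) {
        y[i] ~ dnorm(mu[i], tau)               # normal likelihood
        mu[i] <- alpha + inprod(bg[], x[i, ])  # linear predictor with effective coefficients
      }
      alpha ~ dnorm(0, 1.0E-4)                 # vague prior for the intercept
      for (j in 1:p) {
        gam[j] ~ dbern(0.5)                    # inclusion indicator for covariate j
        beta[j] ~ ddexp(0, lambda)             # Lasso (double-exponential) prior on the coefficient
        bg[j] <- gam[j] * beta[j]              # effective coefficient, i.e., beta(gamma)
      }
      tau ~ dgamma(0.01, 0.01)                 # vague prior for the error precision
      lambda <- 2                              # shrinkage parameter fixed here (assumed value)
    }

Monitoring gam gives posterior inclusion probabilities (as summarized for the indicator parameters in Figure 7), while monitoring bg gives the posterior of the effective coefficients β(γ).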
