The Bayesian information criterion: background, derivation, and applications

The Bayesian information criterion (BIC) is one of the most widely known and pervasively used tools in statistical model selection. Its popularity is derived from its computational simplicity and effective performance in many modeling frameworks, including Bayesian applications where prior distributions may be elusive. The criterion was derived by Schwarz (Ann Stat 1978, 6:461–464) to serve as an asymptotic approximation to a transformation of the Bayesian posterior probability of a candidate model. This article reviews the conceptual and theoretical foundations for BIC, and also discusses its properties and applications. WIREs Comput Stat 2012, 4:199–203. doi: 10.1002/wics.199
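The criterion summarized above is computed as BIC = k ln n − 2 ln L̂, where L̂ is the maximized likelihood of a candidate model, k its number of free parameters, and n the sample size; the model with the smallest BIC is preferred. The following sketch (with illustrative, hypothetical data and helper names not taken from the article) compares two Gaussian candidate models under this formula:

```python
import math

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k*ln(n) - 2*ln(L-hat).
    Smaller values indicate stronger support for the model."""
    return k * math.log(n) - 2.0 * log_likelihood

def gaussian_loglik(xs, mu, var):
    """Log-likelihood of data xs under a Normal(mu, var) model."""
    return sum(-0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
               for x in xs)

# Illustrative data (hypothetical, for demonstration only).
data = [2.1, 1.9, 2.3, 2.0, 1.8, 2.2, 2.4, 1.7, 2.0, 2.1]
n = len(data)

# Candidate 1: mean fixed at 0, only the variance estimated (k = 1).
var0 = sum(x ** 2 for x in data) / n            # MLE of variance given mu = 0
bic1 = bic(gaussian_loglik(data, 0.0, var0), k=1, n=n)

# Candidate 2: both mean and variance estimated (k = 2).
mu = sum(data) / n                              # MLE of the mean
var = sum((x - mu) ** 2 for x in data) / n      # MLE of the variance
bic2 = bic(gaussian_loglik(data, mu, var), k=2, n=n)

# The model with the smaller BIC is selected; here the extra parameter of
# candidate 2 is justified because the data are far from mean zero.
print(bic1, bic2)
```

The k ln n term is the complexity penalty that distinguishes BIC from the likelihood alone: it grows with the sample size, so a parameter must improve the fit increasingly strongly to be retained as n increases.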

This article is categorized under:

  • Statistical and Graphical Methods of Data Analysis > Bayesian Methods and Theory
  • Statistical and Graphical Methods of Data Analysis > Information Theoretic Methods
  • Statistical Learning and Exploratory Methods of the Data Sciences > Modeling Methods
