Scott's rule

Abstract

The optimal construction of a histogram is a fundamental task in data analysis. Many rules of thumb are available to get started. Some use the normal density as a reference distribution. Scott's rule is of that class, using the mean integrated squared error as the measure of discrepancy. This article discusses the origin and formulation of the rule, as well as comparisons with some other formulae and their relative performance. Copyright © 2010 John Wiley & Sons, Inc.

This article is categorized under: Statistical and Graphical Methods of Data Analysis > Density Estimation
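
For concreteness, Scott's rule sets the bin width from only the sample standard deviation and the sample size, h = 3.49 * sigma_hat * n^(-1/3). A minimal sketch in Python (the data, seed, and function name below are illustrative, not from the article):

import numpy as np

def scott_bin_width(x):
    # Normal-reference bin width: h = 3.49 * sigma_hat * n**(-1/3),
    # where 3.49 is approximately (24 * sqrt(pi)) ** (1/3).
    x = np.asarray(x, dtype=float)
    return 3.49 * x.std(ddof=1) * x.size ** (-1.0 / 3.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
h = scott_bin_width(x)
nbins = int(np.ceil((x.max() - x.min()) / h))
print(f"bin width h = {h:.3f}, number of bins = {nbins}")

NumPy implements the same normal-reference rule directly, e.g. np.histogram(x, bins='scott') or np.histogram_bin_edges(x, bins='scott').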

Histogram estimates of 1000 standard normal points with fixed bin widths of 1, 1/3, and 1/9 from left to right. The true density curve is shown in red.
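
A figure of this kind is easy to reproduce in outline. The sketch below is one such reproduction (the seed, colors, and the choice to anchor the bin edges at the sample minimum are illustrative assumptions, not taken from the article's figure):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)
grid = np.linspace(-4, 4, 400)
true_density = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)   # N(0, 1) density

fig, axes = plt.subplots(1, 3, figsize=(12, 3), sharey=True)
for ax, h in zip(axes, [1.0, 1/3, 1/9]):
    edges = np.arange(x.min(), x.max() + h, h)              # fixed bin width h
    ax.hist(x, bins=edges, density=True, color="lightgray", edgecolor="gray")
    ax.plot(grid, true_density, color="red")                # true density curve
    ax.set_title(f"h = {h:.3g}")
plt.tight_layout()
plt.show()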

Plot of the ‘difficulty factor’ for normal, symmetric beta, t, and gamma densities. The factor is constant for the normal density. The other three densities converge to a normal density as the parameter increases.
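
The article's exact definition of the 'difficulty factor' is not reproduced here. As one way to illustrate the convergence the caption describes, the sketch below computes the scale-free optimal-bin-width coefficient C(f) = (6 / ∫ f'(x)² dx)^(1/3) / σ_f, which equals roughly 3.49 for every normal density; for Student t densities it approaches that value as the degrees of freedom grow. The grid limits and parameter values are assumptions for illustration only:

import numpy as np
from math import gamma, sqrt, pi

def t_pdf(x, nu):
    # Student t density with nu degrees of freedom
    c = gamma((nu + 1) / 2) / (sqrt(nu * pi) * gamma(nu / 2))
    return c * (1 + x**2 / nu) ** (-(nu + 1) / 2)

def bin_width_coefficient(pdf, sd, lo=-60.0, hi=60.0, m=200001):
    # Scale-free coefficient (6 / R(f))**(1/3) / sd, with R(f) = integral of f'(x)**2;
    # it equals (24 * sqrt(pi)) ** (1/3) ~ 3.49 for any normal density.
    x = np.linspace(lo, hi, m)
    fprime = np.gradient(pdf(x), x)        # numerical derivative of the density
    roughness = np.trapz(fprime**2, x)     # R(f)
    return (6.0 / roughness) ** (1.0 / 3.0) / sd

for nu in (5, 10, 30, 100):
    sd = sqrt(nu / (nu - 2))               # standard deviation of t_nu, nu > 2
    coef = bin_width_coefficient(lambda z: t_pdf(z, nu), sd)
    print(f"nu = {nu:4d}: coefficient = {coef:.3f}")
# The printed values approach (24 * sqrt(pi)) ** (1/3) ~ 3.49 as nu increases.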

Decomposition of the integrated mean squared error (solid lines) into integrated variance (dashed lines) and integrated squared bias (dashed blue line) for normal data and three sample sizes n = 10², 10³, 10⁴. The optimal bin widths and corresponding optimal IMSE values are shown as red dots.
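
For reference, the standard large-sample decomposition behind this figure, assuming a sufficiently smooth density $f$, bin width $h$, and sample size $n$, is
\[
\mathrm{IMSE}(h) \;\approx\; \underbrace{\frac{1}{nh}}_{\text{integrated variance}} \;+\; \underbrace{\frac{h^{2}}{12}\int f'(x)^{2}\,dx}_{\text{integrated squared bias}},
\qquad
h^{*} \;=\; \left(\frac{6}{n\int f'(x)^{2}\,dx}\right)^{1/3}.
\]
The squared-bias term does not depend on $n$, consistent with the single dashed blue curve serving all three sample sizes. For a normal density with standard deviation $\sigma$, $\int f'(x)^{2}\,dx = 1/(4\sqrt{\pi}\,\sigma^{3})$, which gives Scott's rule $h^{*} = (24\sqrt{\pi})^{1/3}\,\sigma\,n^{-1/3} \approx 3.49\,\sigma\,n^{-1/3}$.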

Related Articles

Exploratory data analysis
