
Use of majority votes in statistical learning


Today, algorithms such as the gradient boosting machine and the random forest are among the most competitive tools in prediction contests. We review how these algorithms came about. The basic underlying idea is to aggregate predictions from a diverse collection of models. We also explore a few of the diverse directions in which the basic idea has evolved, and clarify some common misconceptions that grew as the idea steadily gained popularity.

How to cite this article: WIREs Comput Stat 2015, 7:357–371. doi: 10.1002/wics.1362

This article is categorized under: Statistical Learning and Exploratory Methods of the Data Sciences > Clustering and Classification
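The aggregation idea the abstract describes can be illustrated with a minimal sketch: given class labels predicted by several base models, the ensemble returns, for each observation, the label receiving the most votes. The models and predictions below are hypothetical, not data from the article.

```python
from collections import Counter

def majority_vote(votes):
    # Return the most frequent label among the base-model votes
    # (ties are broken by first occurrence, per Counter.most_common).
    return Counter(votes).most_common(1)[0][0]

# Hypothetical predictions from three base models on four observations.
model_preds = [
    [0, 1, 1, 0],  # model 1
    [0, 1, 0, 0],  # model 2
    [1, 1, 1, 0],  # model 3
]

# Aggregate column-wise: one majority vote per observation.
ensemble = [majority_vote(obs_votes) for obs_votes in zip(*model_preds)]
print(ensemble)  # [0, 1, 1, 0]
```

The ensemble can outperform each base model when the models' errors are sufficiently diverse, which is why the abstract emphasizes aggregating a *diverse* collection of models.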
The limiting quantity, Δ(∞), as a function of (p, q) for π = 0.25 (left), π = 0.5 (middle), and π = 0.75 (right).
The phase diagram showing whether Δ(∞) is positive (+, red), negative (−, green), or zero (0, blue), in the space of (p, q), assuming that π = 1/2. Notice that abrupt phase transitions occur at p = 1/2 and q = 1/2.
