How to cite this WIREs title: WIREs Comp Stat

Statistical learning theory: a tutorial


Abstract

In this article, we provide a tutorial overview of some aspects of statistical learning theory, which also goes by other names such as statistical pattern recognition, nonparametric classification and estimation, and supervised learning. We focus on the problem of two‐class pattern classification for various reasons. This problem is rich enough to capture many of the interesting aspects that are present in the cases of more than two classes and in the problem of estimation, and many of the results can be extended to these cases. Focusing on two‐class pattern classification simplifies our discussion, and yet it is directly applicable to a wide range of practical settings. We begin with a description of the two‐class pattern recognition problem. We then discuss various classical and state‐of‐the‐art approaches to this problem, with a focus on fundamental formulations, algorithms, and theoretical results. In particular, we describe nearest neighbor methods, kernel methods, multilayer perceptrons, Vapnik–Chervonenkis theory, support vector machines, and boosting.

WIREs Comp Stat 2011, 3, 543–556. DOI: 10.1002/wics.179

This article is categorized under:
Statistical Learning and Exploratory Methods of the Data Sciences > Clustering and Classification
Statistical and Graphical Methods of Data Analysis > Nonparametric Methods
Statistical Learning and Exploratory Methods of the Data Sciences > Pattern Recognition
Statistical Learning and Exploratory Methods of the Data Sciences > Knowledge Discovery
Statistical Learning and Exploratory Methods of the Data Sciences > Support Vector Machines
Statistical Learning and Exploratory Methods of the Data Sciences > Neural Networks
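As a concrete illustration of the first family of methods the abstract names, the 1‐nearest‐neighbor rule classifies a query point by the label of its closest training example. The following minimal Python sketch is our own illustration (the data and function names are not from the article):

```python
import math

def nearest_neighbor_classify(x, training_set):
    """1-NN rule: return the label of the training point
    closest to x under Euclidean distance."""
    def dist(p, q):
        return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))
    # training_set: list of (point, label) pairs with label in {0, 1}
    _, label = min(training_set, key=lambda pair: dist(x, pair[0]))
    return label

# Two-class toy data in the plane
train = [((0.0, 0.0), 0), ((1.0, 0.2), 0), ((5.0, 5.0), 1), ((4.5, 5.5), 1)]
print(nearest_neighbor_classify((0.5, 0.1), train))  # -> 0
print(nearest_neighbor_classify((5.2, 4.8), train))  # -> 1
```

The k‐nearest‐neighbor generalization replaces the single closest point by a majority vote among the k closest.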

Voronoi regions.

Large margin separation.

Separating hyperplanes in two dimensions.

Threshold functions versus sigmoid functions.

Hyperplane decision rules. In R2 these are just straight lines.

Perceptron.

Feed forward network.

Epanechnikov kernel.

Cauchy kernel.

Gaussian kernel.

Triangular kernel.

Basic window kernel.

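The kernels pictured in the figures above have standard univariate forms. The versions below, written in terms of u = ||x||, use illustrative normalizations (the article's exact conventions may differ; in particular, several variants of the Cauchy kernel appear in the literature), together with the kernel classification rule they feed into:

```python
import math

def window(u):
    """Basic window (naive/uniform) kernel: indicator of [-1, 1]."""
    return 1.0 if abs(u) <= 1 else 0.0

def triangular(u):
    """Triangular kernel: linear decay, zero outside [-1, 1]."""
    return max(0.0, 1.0 - abs(u))

def epanechnikov(u):
    """Epanechnikov kernel: quadratic, supported on [-1, 1]."""
    return max(0.0, 1.0 - u * u)

def gaussian(u):
    """Gaussian kernel: smooth and everywhere positive."""
    return math.exp(-u * u)

def cauchy(u):
    """Cauchy kernel (one common univariate form): heavy-tailed."""
    return 1.0 / (1.0 + u * u)

def kernel_rule(x, training_set, K, h):
    """Kernel classification rule for 1-D data with labels in {-1, +1}:
    each training point casts a vote weighted by K(|x - x_i| / h)."""
    s = sum(y * K(abs(x - xi) / h) for xi, y in training_set)
    return 1 if s > 0 else -1

train = [(-2.0, -1), (-1.5, -1), (1.0, 1), (2.0, 1)]
print(kernel_rule(-1.0, train, gaussian, 1.0))  # -> -1
print(kernel_rule(1.5, train, gaussian, 1.0))   # -> 1
```

With the basic window kernel this reduces to a majority vote over training points within distance h of the query; the smoother kernels weight closer points more heavily.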
