How to cite this WIREs title: WIREs Comp Stat

Neural networks



Abstract: A variety of artificial neural networks are reviewed, including feed‐forward networks, recurrent networks, associative memories such as the Hopfield network, and the self‐organizing map. Their architectures are described, as are methods that have been developed for training them. Particular emphasis is placed on links with statistical activities, especially regression, classification, and clustering, in contexts such as graphical models and latent structure models. In terms of the training procedures, attention is drawn to the implicit or explicit implementation of statistical methodological approaches such as likelihood and Bayesian methods. Copyright © 2009 John Wiley & Sons, Inc.

This article is categorized under: Statistical Learning and Exploratory Methods of the Data Sciences > Neural Networks

Simple perceptron.

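The figure referenced above depicts a simple perceptron. As an illustrative sketch (not code from the article), the classical Rosenblatt learning rule for such a unit can be written in a few lines of NumPy; the function names and the AND-gate example are ours, chosen because AND is linearly separable and so the rule is guaranteed to converge on it:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=50):
    """Rosenblatt's rule: w <- w + lr * (target - prediction) * x."""
    # augment inputs with a constant 1 so the bias is learned as a weight
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, ti in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            w += lr * (ti - pred) * xi   # update only when the prediction is wrong
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xb @ w > 0).astype(int)

# logical AND: a linearly separable problem the perceptron can solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
preds = predict(w, X)
print(preds)  # [0 0 0 1]
```

The threshold output makes this a classifier; as the abstract notes, such units connect directly to statistical classification methods.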

Architecture for self‐organizing map.

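The figure referenced above shows the architecture of a self‐organizing map. As a hedged sketch of Kohonen's training procedure (our own minimal implementation, not taken from the article), a one‐dimensional SOM moves the winning unit and its grid neighbours toward each input, with learning rate and neighbourhood width decaying over time:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=10, epochs=100, lr0=0.5, sigma0=2.0):
    """1-D SOM: pull the best-matching unit and its grid neighbours toward x."""
    weights = rng.random((n_units, data.shape[1]))
    grid = np.arange(n_units)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.1  # shrinking neighbourhood
        for x in data:
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((grid - winner) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

# two well-separated 2-D clusters; trained units should spread across both
data = np.vstack([rng.normal(0.0, 0.05, (20, 2)),
                  rng.normal(1.0, 0.05, (20, 2))])
w = train_som(data)
```

The shrinking neighbourhood is what distinguishes the SOM from plain k-means-style clustering: early updates order the units along the grid, late updates fine-tune individual units, linking the method to the clustering activities the abstract mentions.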

Hopfield network.

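The figure referenced above shows a Hopfield network, the associative memory highlighted in the abstract. As an illustrative sketch (ours, not from the article), the Hebbian outer-product rule stores patterns in a symmetric weight matrix, and iterating the threshold dynamics recovers a stored pattern from a corrupted cue:

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian storage: W is the sum of outer products, with zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)   # no self-connections
    return W

def hopfield_recall(W, x, steps=10):
    """Synchronous +/-1 updates until the state reaches a fixed point."""
    x = x.copy()
    for _ in range(steps):
        x_new = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hopfield_train(pattern[None, :])
noisy = pattern.copy()
noisy[0] *= -1                      # corrupt one bit
recalled = hopfield_recall(W, noisy)
print(recalled)  # the stored pattern is restored
```

Each update can only decrease the network's energy function, which is why the dynamics settle into stored patterns; this energy view is what connects Hopfield networks to statistical models such as the Boltzmann machine family.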

Feed‐forward network with one hidden layer.

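The figure referenced above shows a feed‐forward network with one hidden layer. As a hedged sketch of how such a network is trained (our own minimal backpropagation loop, not code from the article), fitting it by least squares corresponds to the maximum-likelihood training under Gaussian noise that the abstract alludes to; the architecture sizes and the sine target are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(params, X):
    """One hidden layer of tanh units, linear output (regression)."""
    W1, b1, W2, b2 = params
    H = np.tanh(X @ W1 + b1)
    return H @ W2 + b2, H

def mse(params, X, y):
    out, _ = forward(params, X)
    return np.mean((out - y) ** 2)

# least-squares loss = maximum likelihood under Gaussian observation noise
X = rng.uniform(-1, 1, (100, 1))
y = np.sin(3 * X)

W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
params = [W1, b1, W2, b2]

lr = 0.05
loss0 = mse(params, X, y)
for _ in range(500):
    out, H = forward(params, X)
    err = 2 * (out - y) / len(X)     # gradient of the mean squared error
    gW2 = H.T @ err; gb2 = err.sum(0)
    dH = err @ W2.T * (1 - H ** 2)   # backpropagate through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

loss_final = mse(params, X, y)
print(loss0, loss_final)  # the loss decreases under gradient descent
```

Swapping the linear output for a logistic one (and the squared error for cross-entropy) turns the same architecture into the classification setting the abstract pairs with regression.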

