How to cite this WIREs title:
WIREs Comp Stat

Neural networks


A variety of artificial neural networks are reviewed, including feed‐forward networks, recurrent networks, associative memories such as the Hopfield network, and the self‐organizing map. Their architectures are described, as are methods that have been developed for training them. Particular emphasis is placed on links with statistical activities, especially regression, classification, and clustering, in contexts such as graphical models and latent structure models. In terms of the training procedures, attention is drawn to the implicit or explicit implementation of statistical methodological approaches such as likelihood and Bayesian methods. Copyright © 2009 John Wiley & Sons, Inc.
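To make the feed-forward architecture concrete, here is a minimal NumPy sketch of a network with one hidden layer (the architecture of Figure 2), computing a sigmoid output that can be read as a class probability in the regression/classification sense the abstract describes. All layer sizes, weights, and function names are illustrative, not taken from the article.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, mapping any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(x, W1, b1, W2, b2):
    """One hidden layer: input -> sigmoid hidden units -> sigmoid output."""
    h = sigmoid(W1 @ x + b1)      # hidden-layer activations
    return sigmoid(W2 @ h + b2)   # output in (0, 1)

# Illustrative random weights: 2 inputs, 3 hidden units, 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))
b1 = rng.normal(size=3)
W2 = rng.normal(size=(1, 3))
b2 = rng.normal(size=1)

y = feed_forward(np.array([0.5, -1.0]), W1, b1, W2, b2)
```

Training such a network by minimizing cross-entropy loss corresponds to the maximum-likelihood fitting the abstract alludes to.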

Figure 1. Simple perceptron.
Figure 2. Feed‐forward network with one hidden layer.
Figure 3. Hopfield network.
Figure 4. Architecture for self‐organizing map.
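The Hopfield network of Figure 3 works as an associative memory: patterns are stored in a symmetric weight matrix via a Hebbian outer-product rule, and a corrupted input is recovered by iterating sign updates until a fixed point. The sketch below illustrates this under those standard conventions; the function names and pattern sizes are illustrative, not from the article.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule; stores +/-1 patterns in a symmetric matrix."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, state, steps=10):
    """Iterate synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

stored = np.array([[1, -1, 1, -1, 1, -1]])  # a single +/-1 pattern
W = train_hopfield(stored)
noisy = stored[0].copy()
noisy[0] = -noisy[0]        # corrupt one bit
restored = recall(W, noisy)
```

With a single stored pattern, one update step already restores the flipped bit, since every unit's weighted input points back toward the stored configuration.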

Related Articles

Algorithms for Chemoinformatics: an Interdisciplinary View

Browse by Topic

Machine Learning > Neural Networks
