How to cite this WIREs title: WIREs Comp Stat

Deep learning: Computational aspects


Abstract

In this article, we review computational aspects of deep learning (DL). DL uses network architectures consisting of hierarchical layers of latent variables to construct predictors for high‐dimensional input–output models. Training a DL architecture is computationally intensive, and efficient linear algebra libraries are key for both training and inference. Stochastic gradient descent (SGD) optimization and batch sampling are used to learn from massive datasets.

This article is categorized under:
Statistical Learning and Exploratory Methods of the Data Sciences > Deep Learning
Statistical Learning and Exploratory Methods of the Data Sciences > Modeling Methods
Statistical Learning and Exploratory Methods of the Data Sciences > Neural Networks
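The pipeline the abstract describes — a predictor built from hierarchical layers of affine maps and nonlinearities, trained by minibatch SGD — can be sketched as follows. This is a minimal NumPy illustration, not the article's implementation; the two-layer architecture, sigmoid activation, learning rate, and synthetic regression data are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: high-dimensional inputs, scalar outputs.
n, d, h = 512, 20, 32
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Two-layer predictor: a composition of affine maps and a sigmoid
# nonlinearity, y_hat = w2 . sigmoid(W1 x + b1) + b2.
W1 = 0.1 * rng.normal(size=(h, d))
b1 = np.zeros(h)
w2 = 0.1 * rng.normal(size=h)
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(X):
    return sigmoid(X @ W1.T + b1) @ w2 + b2

mse_init = np.mean((predict(X) - y) ** 2)

# Minibatch SGD: each step samples a small batch, so the cost per
# update does not grow with the dataset size.
lr, batch = 0.1, 32
for step in range(2000):
    idx = rng.choice(n, size=batch, replace=False)
    Xb, yb = X[idx], y[idx]
    Z = sigmoid(Xb @ W1.T + b1)            # hidden layer (latent variables)
    err = Z @ w2 + b2 - yb                 # batch residuals
    # Gradients of the batch mean squared error (backpropagation).
    gw2 = Z.T @ err / batch
    gb2 = err.mean()
    dZ = np.outer(err, w2) * Z * (1 - Z)   # chain rule through the sigmoid
    gW1 = dZ.T @ Xb / batch
    gb1 = dZ.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    w2 -= lr * gw2; b2 -= lr * gb2

mse_final = np.mean((predict(X) - y) ** 2)
```

In practice each matrix product above is dispatched to an optimized linear algebra library (BLAS, or its GPU counterparts), which is why such libraries dominate the cost of training and inference.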
[Figure: Tree-form representation of a composition of affine and sigmoid functions]
[Figure: Deep learning system hierarchy]
[Figure: Random grid search for hyperparameters]
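The random grid search shown in the last figure — sampling hyperparameter settings at random rather than sweeping an exhaustive grid — can be sketched as follows. The objective function and the search ranges here are hypothetical stand-ins; only the search strategy itself comes from the article.

```python
import random

random.seed(0)

def validation_loss(lr, width):
    # Hypothetical stand-in for training a network at these
    # hyperparameters and measuring its validation error.
    return (lr - 0.01) ** 2 + (width - 64) ** 2 / 1e4

# Random search: draw each hyperparameter independently from its
# range, evaluate, and keep the best setting found.
best = None
for _ in range(50):
    lr = 10 ** random.uniform(-4, 0)   # log-uniform learning rate
    width = random.randrange(8, 257)   # hidden-layer width
    loss = validation_loss(lr, width)
    if best is None or loss < best[0]:
        best = (loss, lr, width)
```

Sampling the learning rate log-uniformly is a common choice because its plausible values span several orders of magnitude.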

