How to cite this WIREs title:
WIREs Comput Mol Sci
Impact Factor: 25.113

Machine learning and artificial neural network accelerated computational discoveries in materials science


Abstract Artificial intelligence (AI) has been referred to as the “fourth paradigm of science,” and as part of a coherent toolbox of data‐driven approaches, machine learning (ML) dramatically accelerates computational discoveries. As the machinery for ML algorithms matures, significant advances have been made not only by mainstream AI researchers but also by those working in computational materials science. The number of ML and artificial neural network (ANN) applications in computational materials science is growing at an astounding rate. This perspective briefly reviews state‐of‐the‐art progress in selected supervised and unsupervised methods, together with their respective applications. The characteristics of the primary ML and ANN algorithms are first described. Then, the most important applications of AI in computational materials science, such as empirical interatomic potential development, ML‐based potentials, property prediction, and molecular discovery using generative adversarial networks (GANs), are comprehensively reviewed. The central ideas underlying these ML applications are discussed, and future directions for integrating ML with computational materials science are given. Finally, the applicability and limitations of current ML techniques and the remaining challenges are summarized. This article is categorized under:
Computer and Information Science > Chemoinformatics
Structure and Mechanism > Computational Materials Science
Computer and Information Science > Computer Algorithms and Programming
Software > Molecular Modeling
Schematics of (a) conventional and (b) new relationships among artificial intelligence, machine learning, and deep learning. In the new diagram, AI represents a system that is “intelligent” through rules. Deep learning refers to multilayered models that learn representations of data with multiple levels of abstraction, while machine learning stands for self‐learning algorithms that learn models from “data”
Generative adversarial networks (GAN) enabled transitioning metasurface design. (a) Illustration of conventional methods. (b) The architecture of the proposed GAN model. (Reprinted with permission from Reference . Copyright 2019 ACS Publications)
Machine learning directed search for ultraincompressible, superhard materials. (Reprinted with permission from Reference . Copyright 2019 ACS Publications)
Machine learning results of (a) linear regression, (b) second‐order polynomial regression, (c) decision trees (DT), (d) random forests (RF), and (e) 3‐10‐1, (f) 3‐20‐1, (g) 3‐10‐10‐1, and (h) 3‐20‐20‐1 artificial neural networks (ANN). The red and black square dots represent the predicted and target R values, respectively. (Reprinted with permission from Reference . Copyright 2019 The Royal Society of Chemistry)
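The model comparison in this figure rests on fitting the same data with models of increasing flexibility. A minimal sketch of the idea behind panels (a) and (b), using pure Python least squares on made-up synthetic data (the target function and all values below are illustrative assumptions, not from the article):

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations."""
    n = degree + 1
    # Build the normal-equation system A c = b.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, n))) / A[i][i]
    return coeffs

def predict(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

def mse(coeffs, xs, ys):
    return sum((predict(coeffs, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Synthetic nonlinear "property": y = 2 + 0.5*x^2.
xs = [i / 4 for i in range(-8, 9)]
ys = [2 + 0.5 * x * x for x in xs]

lin = polyfit(xs, ys, 1)   # underfits the curvature, as in panel (a)
quad = polyfit(xs, ys, 2)  # matches the generating model, as in panel (b)
print(mse(lin, xs, ys) > mse(quad, xs, ys))  # → True
```

The DT, RF, and ANN panels follow the same train-then-compare-errors pattern, just with more flexible model families.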
Pearson correlation coefficient map between different materials properties. Htcp (heat capacity), thcd (thermal conductivity), debye (Debye temperature), melt (melting point), dens (density), spdl (speed of sound, longitudinal), spdt (speed of sound, transverse), elam (elastic modulus), blkm (bulk modulus), thex (thermal expansion coefficient), and unitc (unit cell volume). (Reprinted with permission from Reference . Copyright 2019 Nature Research)
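Each cell of such a map is the Pearson correlation coefficient between two property columns across a set of materials. A minimal pure-Python sketch of that computation; the property values below are hypothetical and constructed to be perfectly linear, so r = 1.0 by design:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient r between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical values: Debye temperature vs. longitudinal speed of sound,
# chosen exactly proportional here so the coefficient is exactly 1.0.
debye = [150.0, 240.0, 310.0, 400.0, 640.0]
spdl  = [1.5, 2.4, 3.1, 4.0, 6.4]
print(round(pearson(debye, spdl), 3))  # → 1.0
```

Real property data would of course give values between -1 and 1, which is what the color scale of the map encodes.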
Schematic of the convolutional neural network used to predict the effective thermal conductivity of composite materials. (Reprinted with permission from Reference . Copyright 2019 ScienceDirect)
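The core operation such a network applies to a microstructure image is the 2D convolution. A minimal pure-Python sketch on a toy two-phase "composite" (the image and the averaging kernel are illustrative assumptions, not the article's actual inputs):

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(kernel[i][j] * image[r + i][c + j]
                 for i in range(kh) for j in range(kw))
             for c in range(out_w)]
            for r in range(out_h)]

# Toy 4x4 two-phase microstructure (1 = filler phase, 0 = matrix phase).
image = [[1, 1, 0, 0],
         [1, 1, 0, 0],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]

# A 3x3 averaging kernel acts as a local volume-fraction detector.
avg = [[1 / 9] * 3 for _ in range(3)]
print(conv2d(image, avg))
```

In the actual network, many such kernels are learned from data, and their stacked feature maps feed dense layers that regress the effective thermal conductivity.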
Comparison of the Gaussian approximation potential (GAP) model with other popular empirical interatomic potentials (EIPs) on the phonon spectrum of graphene. The GAP model reproduces the experimentally determined phonon spectrum with the highest accuracy over the whole wave vector range. (Reprinted with permission from Reference . Copyright 2019 APS Physics)
The general workflow of empirical interatomic potential development using machine learning (ML) algorithms. (Reprinted with permission from Reference . Copyright 2019 ACS Publications)
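The workflow in this figure boils down to: generate reference energies (typically from first-principles calculations), then optimize an empirical functional form against them. A minimal sketch using a Lennard-Jones pair potential and a brute-force parameter search on synthetic reference data; the functional form, parameter grid, and all numbers are illustrative assumptions, not the article's method:

```python
def lennard_jones(r, epsilon, sigma):
    """Pairwise LJ energy: 4*eps*((sigma/r)^12 - (sigma/r)^6)."""
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 * s6 - s6)

# Step 1: "reference" energies, standing in for first-principles data,
# generated here from a known potential (eps = 1.0, sigma = 1.0).
rs = [0.95 + 0.05 * i for i in range(20)]
ref = [lennard_jones(r, 1.0, 1.0) for r in rs]

# Step 2: fit by least-squares loss over a grid of candidate parameters.
def loss(eps, sig):
    return sum((lennard_jones(r, eps, sig) - e) ** 2 for r, e in zip(rs, ref))

candidates = [(e / 10, s / 10) for e in range(5, 16) for s in range(5, 16)]
best = min(candidates, key=lambda p: loss(*p))
print(best)  # → (1.0, 1.0): the search recovers the generating parameters
```

ML-based potentials replace the fixed functional form with a flexible regressor (e.g., a neural network or Gaussian process) but keep this same fit-to-reference-data loop.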
(a) Schematic of a multilayer feed‐forward deep neural network. The red, green, and purple circles represent input, hidden, and output neurons, respectively. (b) A typical machine learning workflow for material property prediction. The structure of the atomistic system is first given. A specific mathematical representation encodes the input structure information and is fed into the artificial neural network (ANN) as training features. The trained model can then make predictions on new instances
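The forward pass through such a feed-forward network is a chain of weighted sums and nonlinear activations. A minimal pure-Python sketch with toy layer sizes and fixed illustrative weights (training would adjust them; none of these numbers come from the article):

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer with tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def forward(features):
    # 3 inputs -> 2 hidden neurons -> 1 output (toy sizes; the figure's
    # networks use wider hidden layers such as 3-10-1 or 3-20-20-1).
    hidden = dense(features, [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]], [0.0, 0.1])
    out = dense(hidden, [[1.0, -1.0]], [0.0])
    return out[0]

# A "structure representation" of 3 features (made-up values).
print(forward([0.2, -0.1, 0.4]))
```

Training would minimize the error between these outputs and the target property values over many examples, typically by backpropagation.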