Wavelet Scattering Networks for Atomistic Systems with Extrapolation of
Material Properties
- URL: http://arxiv.org/abs/2006.01247v2
- Date: Fri, 17 Jul 2020 01:34:36 GMT
- Title: Wavelet Scattering Networks for Atomistic Systems with Extrapolation of
Material Properties
- Authors: Paul Sinz and Michael W. Swift and Xavier Brumwell and Jialin Liu and
Kwang Jin Kim and Yue Qi and Matthew Hirn
- Abstract summary: A dream of machine learning in materials science is for a model to learn the underlying physics of an atomic system.
In this work, we test the generalizability of our $\text{Li}_{\alpha}\text{Si}$ energy predictor to properties that were not included in the training set.
- Score: 7.555136209115944
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The dream of machine learning in materials science is for a model to learn
the underlying physics of an atomic system, allowing it to move beyond
interpolation of the training set to the prediction of properties that were not
present in the original training data. In addition to advances in machine
learning architectures and training techniques, achieving this ambitious goal
requires a method to convert a 3D atomic system into a feature representation
that preserves rotational and translational symmetry, smoothness under small
perturbations, and invariance under re-ordering. The atomic orbital wavelet
scattering transform preserves these symmetries by construction, and has
achieved great success as a featurization method for machine learning energy
prediction. Both in small molecules and in the bulk amorphous
$\text{Li}_{\alpha}\text{Si}$ system, machine learning models using wavelet
scattering coefficients as features have demonstrated a comparable accuracy to
Density Functional Theory at a small fraction of the computational cost. In
this work, we test the generalizability of our $\text{Li}_{\alpha}\text{Si}$
energy predictor to properties that were not included in the training set, such
as elastic constants and migration barriers. We demonstrate that statistical
feature selection methods can reduce over-fitting and lead to remarkable
accuracy in these extrapolation tasks.
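To make the feature-selection idea concrete, here is a minimal sketch assuming a precomputed matrix X of invariant descriptors (in the paper these would be atomic orbital wavelet scattering coefficients; here X and the energies y are synthetic placeholders) with an L1-penalized selector followed by ridge regression via scikit-learn. The estimators are illustrative stand-ins for the statistical feature selection described in the abstract, not the authors' exact pipeline.

```python
# Sketch (not the authors' pipeline): select a sparse subset of
# invariant features before fitting a linear energy model, to reduce
# over-fitting when extrapolating beyond the training distribution.
import numpy as np
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_configs, n_features = 200, 500                 # placeholder sizes
X = rng.normal(size=(n_configs, n_features))     # stand-in for scattering features
y = X[:, :10] @ rng.normal(size=10) + 0.01 * rng.normal(size=n_configs)

model = make_pipeline(
    StandardScaler(),
    # L1-regularised selection keeps only features with non-zero weight.
    SelectFromModel(LassoCV(cv=5)),
    # Simple ridge regression on the surviving features.
    RidgeCV(alphas=np.logspace(-3, 3, 13)),
)
model.fit(X, y)
print("selected features:",
      model.named_steps["selectfrommodel"].get_support().sum())
```

In the extrapolation tests above, quantities such as elastic constants would be derived from the predicted energies (e.g., from the curvature of energy versus strain), so a sparser model that does not over-fit the training energies is what makes those downstream predictions reliable.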
Related papers
- Model-free quantification of completeness, uncertainties, and outliers in atomistic machine learning using information theory [4.59916193837551]
Atomistic machine learning (ML) often relies on unsupervised learning or model predictions to analyze information contents.
Here, we introduce a theoretical framework that provides a rigorous, model-free tool to quantify information contents in atomistic simulations.
arXiv Detail & Related papers (2024-04-18T17:50:15Z)
- Electronic Structure Prediction of Multi-million Atom Systems Through Uncertainty Quantification Enabled Transfer Learning [5.4875371069660925]
Ground state electron density -- obtainable using Kohn-Sham Density Functional Theory (KS-DFT) simulations -- contains a wealth of material information.
However, the computational expense of KS-DFT scales cubically with system size, which tends to stymie training data generation.
Here, we address this fundamental challenge by employing transfer learning to leverage the multi-scale nature of the training data.
arXiv Detail & Related papers (2023-08-24T21:41:29Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Hessian-based toolbox for reliable and interpretable machine learning in physics [58.720142291102135]
We present a toolbox for interpretability and reliability that is agnostic of the model architecture.
It provides a notion of the influence of the input data on the prediction at a given test point, an estimation of the uncertainty of the model predictions, and an extrapolation score for the model predictions.
Our work opens the road to the systematic use of interpretability and reliability methods in ML applied to physics and, more generally, science.
arXiv Detail & Related papers (2021-08-04T16:32:59Z)
- Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
- SE(3)-equivariant prediction of molecular wavefunctions and electronic densities [4.2572103161049055]
We introduce general SE(3)-equivariant operations and building blocks for constructing deep learning architectures for geometric point cloud data.
Our model reduces prediction errors by up to two orders of magnitude compared to the previous state-of-the-art.
We demonstrate the potential of our approach in a transfer learning application, where a model trained on low accuracy reference wavefunctions implicitly learns to correct for electronic many-body interactions.
arXiv Detail & Related papers (2021-06-04T08:57:46Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Automated discovery of a robust interatomic potential for aluminum [4.6028828826414925]
Machine learning (ML) based potentials aim for faithful emulation of quantum mechanics (QM) calculations at drastically reduced computational cost.
We present a highly automated approach to dataset construction using the principles of active learning (AL).
We demonstrate this approach by building an ML potential for aluminum (ANI-Al).
To demonstrate transferability, we perform a 1.3M atom shock simulation, and show that ANI-Al predictions agree very well with DFT calculations on local atomic environments sampled from the nonequilibrium dynamics.
arXiv Detail & Related papers (2020-03-10T19:06:32Z)
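As a concrete illustration of the active-learning dataset construction described in the ANI-Al entry above, the following is a minimal query-by-committee sketch. Everything in it is a hypothetical stand-in (a 1D toy `reference_energy`, a bootstrap polynomial committee in place of neural-network potentials, arbitrary pool sizes), not the authors' implementation.

```python
# Sketch of active learning: an ensemble of cheap models flags the
# configurations where it disagrees most, and only those are sent to
# the expensive reference (QM/DFT) calculation for labeling.
import numpy as np

def reference_energy(x):
    """Placeholder for an expensive QM/DFT reference calculation."""
    return np.sin(3.0 * x).sum()

def fit_ensemble(X, y, n_models=5, rng=None):
    """Fit a small committee of bootstrapped polynomial models (stand-ins for NN potentials)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    models = []
    for _ in range(n_models):
        idx = rng.choice(len(X), size=len(X), replace=True)  # bootstrap resample
        models.append(np.polyfit(X[idx].ravel(), y[idx], deg=5))
    return models

def committee_std(models, X):
    """Committee disagreement at each candidate configuration."""
    preds = np.stack([np.polyval(c, X.ravel()) for c in models])
    return preds.std(axis=0)

rng = np.random.default_rng(1)
X_train = rng.uniform(-1.0, 1.0, size=(10, 1))        # initial small labeled set
y_train = np.array([reference_energy(x) for x in X_train])

for round_ in range(4):
    models = fit_ensemble(X_train, y_train, rng=rng)
    candidates = rng.uniform(-2.0, 2.0, size=(200, 1))  # unlabeled candidate pool
    sigma = committee_std(models, candidates)
    query = candidates[np.argmax(sigma)][None, :]        # most uncertain configuration
    X_train = np.vstack([X_train, query])
    y_train = np.append(y_train, reference_energy(query[0]))
    print(f"round {round_}: dataset size {len(X_train)}, max disagreement {sigma.max():.3f}")
```

The design point being illustrated is that the reference calculation is only invoked where the committee disagrees most, which is how active learning keeps the labeled dataset small while extending the model's range of validity.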