Stability-Aware Training of Neural Network Interatomic Potentials with
Differentiable Boltzmann Estimators
- URL: http://arxiv.org/abs/2402.13984v1
- Date: Wed, 21 Feb 2024 18:12:07 GMT
- Title: Stability-Aware Training of Neural Network Interatomic Potentials with
Differentiable Boltzmann Estimators
- Authors: Sanjeev Raja, Ishan Amin, Fabian Pedregosa, Aditi S. Krishnapriyan
- Abstract summary: We present Stability-Aware Boltzmann Estimator (StABlE) Training to produce stable and accurate NNIPs.
StABlE Training iteratively runs MD simulations to seek out unstable regions, and corrects the instabilities via supervision with a reference observable.
We demonstrate our methodology across organic molecules, tetrapeptides, and condensed-phase systems, using three modern NNIP architectures.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural network interatomic potentials (NNIPs) are an attractive alternative
to ab-initio methods for molecular dynamics (MD) simulations. However, they can
produce unstable simulations which sample unphysical states, limiting their
usefulness for modeling phenomena occurring over longer timescales. To address
these challenges, we present Stability-Aware Boltzmann Estimator (StABlE)
Training, a multi-modal training procedure which combines conventional
supervised training from quantum-mechanical energies and forces with reference
system observables, to produce stable and accurate NNIPs. StABlE Training
iteratively runs MD simulations to seek out unstable regions, and corrects the
instabilities via supervision with a reference observable. The training
procedure is enabled by the Boltzmann Estimator, which allows efficient
computation of gradients required to train neural networks to system
observables, and can detect both global and local instabilities. We demonstrate
our methodology across organic molecules, tetrapeptides, and condensed-phase
systems, using three modern NNIP architectures. In all three cases,
StABlE-trained models achieve significant improvements in simulation stability
and recovery of structural and dynamic observables. In some cases,
StABlE-trained models outperform conventional models trained on datasets 50
times larger. As a general framework applicable across NNIP architectures and
systems, StABlE Training is a powerful tool for training stable and accurate
NNIPs, particularly in the absence of large reference datasets.
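The abstract names a Boltzmann Estimator for computing gradients that train the network toward system observables, but does not spell out its form. A standard covariance identity from equilibrium statistical mechanics gives such gradients and can serve as a minimal sketch of the idea: for samples drawn from the Boltzmann distribution p(x) ∝ exp(-βE_θ(x)) and an observable O(x) independent of θ, d⟨O⟩/dθ = -β Cov(O, ∂E/∂θ). The function name and the harmonic toy system below are illustrative assumptions, not taken from the paper.

```python
# Sketch of a Boltzmann-type gradient estimator for ensemble observables,
# assuming equilibrium samples x ~ p(x) ∝ exp(-beta * E_theta(x)).
# For O(x) independent of theta: d<O>/dtheta = -beta * Cov(O, dE/dtheta).
import numpy as np

def boltzmann_gradient_estimate(O_samples, dE_dtheta_samples, beta):
    """Estimate d<O>/dtheta from equilibrium samples via the covariance identity."""
    cov = (np.mean(O_samples * dE_dtheta_samples)
           - np.mean(O_samples) * np.mean(dE_dtheta_samples))
    return -beta * cov

# Sanity check on a harmonic potential E_theta(x) = 0.5 * theta * x**2:
# x is Gaussian with variance 1/(beta*theta), so <x^2> = 1/(beta*theta)
# and the exact gradient is d<x^2>/dtheta = -1/(beta*theta**2).
rng = np.random.default_rng(0)
beta, theta, n = 1.0, 2.0, 400_000
x = rng.normal(0.0, np.sqrt(1.0 / (beta * theta)), size=n)
est = boltzmann_gradient_estimate(x**2, 0.5 * x**2, beta)
exact = -1.0 / (beta * theta**2)  # = -0.25
```

In a training loop, such an estimate would supply the gradient signal that pushes the potential's predicted observable toward the reference value whenever an MD rollout is flagged as unstable; the paper's actual estimator and instability detection are more involved than this toy.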
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - A Multi-Grained Symmetric Differential Equation Model for Learning
Protein-Ligand Binding Dynamics [74.93549765488103]
In drug discovery, molecular dynamics simulation provides a powerful tool for predicting binding affinities, estimating transport properties, and exploring pocket sites.
We propose NeuralMD, the first machine learning surrogate that can facilitate numerical MD and provide accurate simulations in protein-ligand binding.
We show the efficiency and effectiveness of NeuralMD, with a 2000× speedup over standard numerical MD simulation and outperforming all other ML approaches by up to 80% under the stability metric.
arXiv Detail & Related papers (2024-01-26T09:35:17Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z) - Neural net modeling of equilibria in NSTX-U [0.0]
We develop two neural networks relevant to equilibrium and shape control modeling.
Networks include Eqnet, a free-boundary equilibrium solver trained on the EFIT01 reconstruction algorithm, and Pertnet, which is trained on the Gspert code.
We report strong performance for both networks indicating that these models could reliably be used within closed-loop simulations.
arXiv Detail & Related papers (2022-02-28T16:09:58Z) - Learning Stochastic Dynamics with Statistics-Informed Neural Network [0.4297070083645049]
We introduce a machine-learning framework named statistics-informed neural network (SINN) for learning dynamics from data.
We devise mechanisms for training the neural network model to reproduce the correct statistical behavior of a target process.
We show that the obtained reduced-order model can be trained on temporally coarse-grained data and hence is well suited for rare-event simulations.
arXiv Detail & Related papers (2022-02-24T18:21:01Z) - Using scientific machine learning for experimental bifurcation analysis
of dynamic systems [2.204918347869259]
This study focuses on training universal differential equation (UDE) models for physical nonlinear dynamical systems with limit cycles.
We consider examples where training data is generated by numerical simulations, whereas we also employ the proposed modelling concept to physical experiments.
We use both neural networks and Gaussian processes as universal approximators alongside the mechanistic models to give a critical assessment of the accuracy and robustness of the UDE modelling approach.
arXiv Detail & Related papers (2021-10-22T15:43:03Z) - Fast and Sample-Efficient Interatomic Neural Network Potentials for
Molecules and Materials Based on Gaussian Moments [3.1829446824051195]
We present an improved NN architecture based on the previous GM-NN model.
The improved methodology is a prerequisite for training-heavy workflows such as active learning or learning-on-the-fly.
arXiv Detail & Related papers (2021-09-20T14:23:34Z) - Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep active learning framework for simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP-based models.
The results demonstrate that STNP outperforms the baselines in the learning setting and that LIG achieves state-of-the-art performance for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z) - SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate
Interatomic Potentials [0.17590081165362778]
NequIP is an SE(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations.
The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency.
arXiv Detail & Related papers (2021-01-08T18:49:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.