Simple and efficient algorithms for training machine learning potentials to force data
- URL: http://arxiv.org/abs/2006.05475v1
- Date: Tue, 9 Jun 2020 19:36:40 GMT
- Title: Simple and efficient algorithms for training machine learning potentials to force data
- Authors: Justin S. Smith, Nicholas Lubbers, Aidan P. Thompson, Kipton Barros
- Abstract summary: Machine learning models, trained on data from ab initio quantum simulations, are yielding molecular dynamics potentials with unprecedented accuracy.
One limiting factor is the quantity of available training data, which can be expensive to obtain.
We present a new algorithm for efficient force training, and benchmark its accuracy by training to forces from real-world datasets for organic chemistry and bulk aluminum.
- Score: 2.924868086534434
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning models, trained on data from ab initio quantum
simulations, are yielding molecular dynamics potentials with unprecedented
accuracy. One limiting factor is the quantity of available training data, which
can be expensive to obtain. A quantum simulation often provides all atomic
forces, in addition to the total energy of the system. These forces provide
much more information than the energy alone. It may appear that training a
model to this large quantity of force data would introduce significant
computational costs. Actually, training to all available force data should only
be a few times more expensive than training to energies alone. Here, we present
a new algorithm for efficient force training, and benchmark its accuracy by
training to forces from real-world datasets for organic chemistry and bulk
aluminum.
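The core idea of joint energy and force training can be illustrated with a deliberately tiny 1-D example (a hypothetical harmonic model, not the paper's algorithm): the energy E(x) = 0.5·k·x² and the force F(x) = −dE/dx = −k·x share the parameter k, so force labels enter the same gradient-descent loop as energy labels.

```python
import numpy as np

# Toy illustration of joint energy + force training (hypothetical model,
# not the paper's algorithm): fit the stiffness k of a 1-D harmonic
# potential to noisy energy AND force labels via a combined loss.
rng = np.random.default_rng(0)
k_true = 2.0
x = rng.uniform(-1.0, 1.0, size=64)
E_ref = 0.5 * k_true * x**2 + 0.01 * rng.normal(size=x.size)
F_ref = -k_true * x + 0.01 * rng.normal(size=x.size)

k = 0.5   # initial guess
lr = 0.5  # gradient-descent step size
w_F = 1.0 # force-loss weight (a tunable hyperparameter)
for _ in range(200):
    E_pred = 0.5 * k * x**2
    F_pred = -k * x
    # analytic d/dk of mean squared energy and force errors
    gE = np.mean(2 * (E_pred - E_ref) * 0.5 * x**2)
    gF = np.mean(2 * (F_pred - F_ref) * (-x))
    k -= lr * (gE + w_F * gF)

print(round(k, 2))  # k should land close to k_true = 2.0
```

The force-loss weight w_F trades off energy versus force accuracy; the point of the paper is that, for real neural-network potentials, including the force term need only cost a few times more than the energy term alone.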
Related papers
- Physical Consistency Bridges Heterogeneous Data in Molecular Multi-Task Learning [79.75718786477638]
We exploit the fact that molecular tasks are connected by physical laws, and design consistency training approaches that enforce these laws.
We demonstrate that the more accurate energy data can improve the accuracy of structure prediction.
We also find that consistency training can directly leverage force and off-equilibrium structure data to improve structure prediction.
arXiv Detail & Related papers (2024-10-14T03:11:33Z)
- Physics-Informed Weakly Supervised Learning for Interatomic Potentials [17.165117198519248]
We introduce a physics-informed, weakly supervised approach for training machine-learned interatomic potentials.
We demonstrate reduced energy and force errors -- often lower by a factor of two -- for various baseline models and benchmark data sets.
arXiv Detail & Related papers (2024-07-23T12:49:04Z)
- Quantum Hardware-Enabled Molecular Dynamics via Transfer Learning [1.9144534010016192]
By combining transfer learning with techniques for building machine-learned potential energy surfaces, we propose a new path forward for molecular dynamics simulations on quantum hardware.
We demonstrate this approach by training machine learning models to predict a molecule's potential energy using Behler-Parrinello neural networks.
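The Behler-Parrinello construction mentioned above decomposes the total energy into atomic contributions, each produced by a small network applied to a per-atom descriptor. A minimal sketch (with random stand-ins for the real symmetry-function descriptors) looks like:

```python
import numpy as np

# Minimal sketch of the Behler-Parrinello decomposition (illustrative only):
# total energy = sum of atomic energies, each from a small neural network
# evaluated on that atom's descriptor vector.
rng = np.random.default_rng(1)

def atomic_net(g, W1, b1, w2, b2):
    """One hidden-layer network mapping a descriptor to an atomic energy."""
    h = np.tanh(g @ W1 + b1)
    return h @ w2 + b2

n_atoms, n_desc, n_hidden = 5, 8, 16
W1 = rng.normal(size=(n_desc, n_hidden))
b1 = np.zeros(n_hidden)
w2 = rng.normal(size=n_hidden)
b2 = 0.0

G = rng.normal(size=(n_atoms, n_desc))  # stand-ins for symmetry functions
E_atomic = np.array([atomic_net(g, W1, b1, w2, b2) for g in G])
E_total = E_atomic.sum()  # summing makes the energy permutation-invariant
print(E_atomic.shape, float(E_total))
```

Because the same network is applied to every atom and the results are summed, relabeling the atoms leaves the total energy unchanged.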
arXiv Detail & Related papers (2024-06-12T18:00:09Z)
- Machine Learning Force Fields with Data Cost Aware Training [94.78998399180519]
Machine learning force fields (MLFF) have been proposed to accelerate molecular dynamics (MD) simulation.
Even for the most data-efficient MLFFs, reaching chemical accuracy can require hundreds of frames of force and energy labels.
We propose a multi-stage computational framework -- ASTEROID, which lowers the data cost of MLFFs by leveraging a combination of cheap inaccurate data and expensive accurate data.
arXiv Detail & Related papers (2023-06-05T04:34:54Z)
- Multi-Fidelity Machine Learning for Excited State Energies of Molecules [0.0]
It is shown that the multi-fidelity machine learning model can achieve the same accuracy as a machine learning model built only on high cost training data.
The numerical gain observed in these benchmark calculations was over a factor of 30, and can be much higher for high-accuracy data.
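A generic way to realize such a multi-fidelity gain is delta-learning: fit a surrogate to many cheap low-fidelity labels, then fit only the small correction from a handful of expensive high-fidelity labels. The sketch below is illustrative (synthetic 1-D functions, not the paper's model):

```python
import numpy as np

# Hedged sketch of the multi-fidelity idea via delta-learning
# (not the paper's exact model).
f_lo = lambda x: np.sin(3 * x)              # cheap, plentiful fidelity
f_hi = lambda x: np.sin(3 * x) + 0.3 * x    # expensive, scarce fidelity

x_lo = np.linspace(-1, 1, 200)              # many cheap labels
x_hi = np.array([-0.8, -0.2, 0.4, 0.9])     # only four expensive labels

# Stage 1: surrogate for the low-fidelity surface.
lo_fit = np.polynomial.Polynomial.fit(x_lo, f_lo(x_lo), deg=9)
# Stage 2: learn only the smooth low->high correction from the few
# high-fidelity points.
delta_fit = np.polynomial.Polynomial.fit(
    x_hi, f_hi(x_hi) - lo_fit(x_hi), deg=1
)

x_test = np.linspace(-1, 1, 50)
pred = lo_fit(x_test) + delta_fit(x_test)
rmse = np.sqrt(np.mean((pred - f_hi(x_test)) ** 2))
print(f"RMSE vs high fidelity: {rmse:.3f}")
```

Four expensive labels suffice here because the correction is much smoother than the surface itself, which is the same structural assumption behind multi-fidelity models for excited-state energies.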
arXiv Detail & Related papers (2023-05-18T20:21:22Z)
- Hindsight States: Blending Sim and Real Task Elements for Efficient Reinforcement Learning [61.3506230781327]
In robotics, one approach to generate training data builds on simulations based on dynamics models derived from first principles.
Here, we leverage the imbalance in complexity of the dynamics to learn more sample-efficiently.
We validate our method on several challenging simulated tasks and demonstrate that it improves learning both alone and when combined with an existing hindsight algorithm.
arXiv Detail & Related papers (2023-03-03T21:55:04Z)
- Transfer learning for chemically accurate interatomic neural network potentials [0.0]
We show that pre-training the network parameters on data obtained from density functional calculations improves the sample efficiency of models trained on more accurate ab-initio data.
We provide GM-NN potentials pre-trained and fine-tuned on the ANI-1x and ANI-1ccx data sets, which can easily be fine-tuned on and applied to organic molecules.
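The pre-train-then-fine-tune recipe can be sketched generically (this is not the GM-NN code; the synthetic reference functions and the linear correction head are illustrative assumptions):

```python
import numpy as np

# Generic transfer-learning sketch: a model "pre-trained" on abundant
# cheap DFT-level labels is frozen, and only a small correction head is
# fit to a handful of more accurate labels.
f_dft = lambda x: np.sin(2 * x)            # cheap reference method
f_acc = lambda x: np.sin(2 * x) + 0.1 * x  # more accurate reference method

# "Pre-training": fit a degree-5 polynomial to 300 cheap labels.
x_pre = np.linspace(-1, 1, 300)
pre = np.polynomial.Polynomial.fit(x_pre, f_dft(x_pre), deg=5)

# "Fine-tuning": learn only a linear correction from 4 accurate labels,
# keeping the pre-trained model frozen.
x_ft = np.array([-0.9, -0.3, 0.2, 0.7])
resid = f_acc(x_ft) - pre(x_ft)
head = np.polynomial.Polynomial.fit(x_ft, resid, deg=1)

x_test = np.linspace(-1, 1, 100)
err = np.sqrt(np.mean((pre(x_test) + head(x_test) - f_acc(x_test)) ** 2))
print(f"fine-tuned RMSE: {err:.4f}")
```

Pre-training absorbs the bulk of the learning from cheap data, so the expensive ab-initio labels only need to pin down a small systematic correction, which is what makes the approach sample-efficient.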
arXiv Detail & Related papers (2022-12-07T19:21:01Z)
- Continual learning autoencoder training for a particle-in-cell simulation via streaming [52.77024349608834]
The upcoming exascale era will provide a new generation of physics simulations with very high resolution.
This resolution will impact the training of machine learning models, since storing such large amounts of simulation data on disk is nearly impossible.
This work presents an approach that trains a neural network concurrently to a running simulation without data on a disk.
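The disk-free training loop described here amounts to consuming simulation output as a stream: each batch is used for one optimizer step and then discarded. A generic online-SGD sketch (the generator and linear target are stand-ins for the actual particle-in-cell data):

```python
import numpy as np

# Generic sketch of streaming training (not the paper's pipeline): batches
# emitted by a running "simulation" are consumed immediately by SGD, so
# nothing is ever written to disk.
rng = np.random.default_rng(5)

def simulation_stream(n_steps, batch=32):
    """Stand-in for a running simulation emitting (x, y) batches."""
    for _ in range(n_steps):
        x = rng.normal(size=(batch, 3))
        y = x @ np.array([1.0, -2.0, 0.5])  # hidden linear relation
        yield x, y

w = np.zeros(3)
lr = 0.1
for x, y in simulation_stream(500):
    grad = 2 * x.T @ (x @ w - y) / len(x)
    w -= lr * grad  # each batch is seen once, then discarded

print(np.round(w, 2))  # converges to the hidden coefficients [1, -2, 0.5]
```

The same single-pass pattern applies to training an autoencoder concurrently with a simulation; the key constraint is that the optimizer can never revisit old batches.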
arXiv Detail & Related papers (2022-11-09T09:55:14Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
- ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations [86.41674945012369]
We develop a scalable and expressive graph neural network model, ForceNet, to approximate atomic forces.
Our proposed ForceNet is able to predict atomic forces more accurately than state-of-the-art physics-based GNNs.
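Direct force prediction with a graph network can be sketched as pairwise messages along bond vectors (an illustrative toy, not the actual ForceNet architecture). Built this way, the predicted forces automatically sum to zero and rotate with the structure:

```python
import numpy as np

# Toy sketch of direct force prediction with a graph network (illustrative,
# not ForceNet itself): each atom's force is a sum of learned pairwise
# messages directed along interatomic displacement vectors.
rng = np.random.default_rng(4)

def pair_weight(r, theta):
    """Tiny learned function of the scalar distance -> message weight."""
    h = np.tanh(theta[0] * r + theta[1])
    return theta[2] * h + theta[3]

def predict_forces(pos, theta, cutoff=2.0):
    n = len(pos)
    F = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = pos[j] - pos[i]
            r = np.linalg.norm(d)
            if r < cutoff:
                F[i] += pair_weight(r, theta) * d / r  # message along bond
    return F

pos = rng.uniform(0, 2, size=(6, 3))  # random 6-atom configuration
theta = rng.normal(size=4)            # untrained illustrative parameters
F = predict_forces(pos, theta)
print(F.shape)  # one 3-vector of force per atom
```

Because the message from j to i is exactly opposite the message from i to j, the total predicted force vanishes (Newton's third law), and because the weights depend only on distances, the prediction is rotation-equivariant.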
arXiv Detail & Related papers (2021-03-02T03:09:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.