Transfer learning for chemically accurate interatomic neural network
potentials
- URL: http://arxiv.org/abs/2212.03916v1
- Date: Wed, 7 Dec 2022 19:21:01 GMT
- Title: Transfer learning for chemically accurate interatomic neural network
potentials
- Authors: Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, and Johannes Kästner
- Abstract summary: We show that pre-training the network parameters on data obtained from density functional calculations improves the sample efficiency of models trained on more accurate ab-initio data.
We provide GM-NN potentials pre-trained and fine-tuned on the ANI-1x and ANI-1ccx data sets, which can easily be fine-tuned on and applied to organic molecules.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Developing machine learning-based interatomic potentials from ab-initio
electronic structure methods remains a challenging task for computational
chemistry and materials science. This work studies the capability of transfer
learning for efficiently generating chemically accurate interatomic neural
network potentials on organic molecules from the MD17 and ANI data sets. We
show that pre-training the network parameters on data obtained from density
functional calculations considerably improves the sample efficiency of models
trained on more accurate ab-initio data. Additionally, we show that fine-tuning
with energy labels alone suffices to obtain accurate atomic forces and run
large-scale atomistic simulations. We also investigate possible limitations of
transfer learning, especially regarding the design and size of the pre-training
and fine-tuning data sets. Finally, we provide GM-NN potentials pre-trained and
fine-tuned on the ANI-1x and ANI-1ccx data sets, which can easily be fine-tuned
on and applied to organic molecules.
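The two-stage recipe described above (pre-train on plentiful DFT labels, then fine-tune on scarcer, more accurate data, with forces obtained as negative energy gradients) can be illustrated with a minimal, hypothetical PyTorch sketch. The toy descriptor, network sizes, and random data below are placeholders, not the actual GM-NN model or the ANI-1x/ANI-1ccx pipelines:

```python
import torch
import torch.nn as nn

class ToyPotential(nn.Module):
    """Toy invariant potential: Gaussian basis over pairwise distances -> MLP."""
    def __init__(self, n_basis: int = 16):
        super().__init__()
        self.register_buffer("centers", torch.linspace(0.5, 5.0, n_basis))
        self.mlp = nn.Sequential(nn.Linear(n_basis, 64), nn.SiLU(), nn.Linear(64, 1))

    def forward(self, pos):
        dists = torch.pdist(pos)                                    # (n_pairs,)
        feats = torch.exp(-(dists[:, None] - self.centers) ** 2).sum(0)
        return self.mlp(feats).squeeze()                            # scalar energy

def energy_and_forces(model, pos):
    pos = pos.detach().requires_grad_(True)
    energy = model(pos)
    (grad,) = torch.autograd.grad(energy, pos, create_graph=True)
    return energy, -grad                                            # F = -dE/dR

def train(model, data, lr, use_forces):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for pos, e_ref, f_ref in data:
        energy, forces = energy_and_forces(model, pos)
        loss = (energy - e_ref) ** 2
        if use_forces:                                              # pre-training uses E + F
            loss = loss + ((forces - f_ref) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

model = ToyPotential()
# Stage 1: pre-train on plentiful, cheaper labels (energies + forces, "DFT").
dft = [(torch.randn(5, 3), torch.randn(()), torch.randn(5, 3)) for _ in range(64)]
train(model, dft, lr=1e-3, use_forces=True)
# Stage 2: fine-tune on scarce, higher-accuracy labels, energies only ("CC").
cc = [(torch.randn(5, 3), torch.randn(()), None) for _ in range(8)]
train(model, cc, lr=1e-4, use_forces=False)
```

Note how the fine-tuning stage uses energy labels only: because forces are recovered as -dE/dR by automatic differentiation, the fine-tuned model still yields forces for atomistic simulation, mirroring the energy-only fine-tuning result reported in the abstract.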
Related papers
- Learning and Controlling Silicon Dopant Transitions in Graphene using
Scanning Transmission Electron Microscopy [58.51812955462815]
We introduce a machine learning approach to determine the transition dynamics of silicon atoms on a single layer of carbon atoms.
The data samples are processed and filtered to produce symbolic representations, which we use to train a neural network to predict transition probabilities.
These learned transition dynamics are then leveraged to guide a single silicon atom throughout the lattice to pre-determined target destinations.
arXiv Detail & Related papers (2023-11-21T21:51:00Z)
- Gradual Optimization Learning for Conformational Energy Minimization [69.36925478047682]
The Gradual Optimization Learning Framework (GOLF) for energy minimization with neural networks significantly reduces the amount of additional data required.
Our results demonstrate that the neural network trained with GOLF performs on par with the oracle on a benchmark of diverse drug-like molecules.
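As a rough illustration of what energy minimization with a neural potential involves (generic geometry relaxation, not the GOLF training procedure itself), atomic positions can be optimized directly against any positions-to-energy model:

```python
import torch

def relax(model, pos, steps: int = 200, lr: float = 0.1):
    """Minimize model(pos) over atomic positions with L-BFGS."""
    pos = pos.clone().requires_grad_(True)
    opt = torch.optim.LBFGS([pos], lr=lr, max_iter=steps)

    def closure():
        opt.zero_grad()
        energy = model(pos)
        energy.backward()
        return energy

    opt.step(closure)
    return pos.detach(), float(model(pos))

# Toy pair potential standing in for a trained neural network potential.
model = lambda p: ((torch.pdist(p) - 1.5) ** 2).sum()
relaxed_pos, e_min = relax(model, torch.randn(6, 3))
```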
arXiv Detail & Related papers (2023-11-05T11:48:08Z)
- Synthetic pre-training for neural-network interatomic potentials [0.0]
We show that synthetic atomistic data, themselves obtained at scale with an existing machine learning potential, constitute a useful pre-training task for neural-network interatomic potential models.
Once pre-trained with a large synthetic dataset, these models can be fine-tuned on a much smaller, quantum-mechanical one, improving numerical accuracy and stability in computational practice.
arXiv Detail & Related papers (2023-07-24T17:16:24Z)
- On the Interplay of Subset Selection and Informed Graph Neural Networks [3.091456764812509]
This work focuses on predicting the atomization energy of molecules in the QM9 dataset.
We show how maximizing molecular diversity in the training-set selection process increases the robustness of linear and nonlinear regression techniques (see the selection sketch below).
We also check the reliability of the predictions made by the graph neural network with a model-agnostic explainer.
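One common, simple way to realize such diversity maximization is greedy farthest-point sampling over molecular descriptor vectors. The sketch below uses random placeholder descriptors rather than actual QM9 features:

```python
import numpy as np

def farthest_point_subset(X: np.ndarray, k: int) -> list[int]:
    """Greedily pick k rows of X that are maximally spread out."""
    chosen = [0]                                   # arbitrary seed point
    d_min = np.linalg.norm(X - X[0], axis=1)       # distance to chosen set
    for _ in range(k - 1):
        nxt = int(np.argmax(d_min))                # farthest from current subset
        chosen.append(nxt)
        d_min = np.minimum(d_min, np.linalg.norm(X - X[nxt], axis=1))
    return chosen

descriptors = np.random.default_rng(0).normal(size=(1000, 32))
train_idx = farthest_point_subset(descriptors, k=100)
```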
arXiv Detail & Related papers (2023-06-15T09:09:27Z)
- Transfer learning for atomistic simulations using GNNs and kernel mean embeddings [24.560340485988128]
We propose a transfer learning algorithm that leverages the ability of graph neural networks (GNNs) to represent chemical environments together with kernel mean embeddings.
We test our approach on a series of realistic datasets of increasing complexity, showing excellent generalization and transferability performance.
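The underlying idea can be sketched generically: pool per-atom features (here random placeholders standing in for learned GNN embeddings) into a kernel mean embedding, whose inner product is the average pairwise kernel between atomic environments, and regress on the resulting Gram matrix. This illustrates the mechanism only, not the authors' exact algorithm:

```python
import numpy as np

def mean_embedding_kernel(A, B, gamma=0.5):
    """<mu_A, mu_B> under an RBF kernel; A, B are (n_atoms, d) atom features."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)   # (n_A, n_B) squared dists
    return float(np.exp(-gamma * sq).mean())

rng = np.random.default_rng(0)
# Random per-atom features standing in for GNN atomic-environment embeddings.
molecules = [rng.normal(size=(int(rng.integers(3, 9)), 16)) for _ in range(50)]
energies = rng.normal(size=50)

K = np.array([[mean_embedding_kernel(a, b) for b in molecules] for a in molecules])
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(K)), energies)   # kernel ridge fit
prediction = K[0] @ alpha                                      # predict first structure
```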
arXiv Detail & Related papers (2023-06-02T14:58:16Z)
- Accurate Machine Learned Quantum-Mechanical Force Fields for Biomolecular Simulations [51.68332623405432]
Molecular dynamics (MD) simulations allow atomistic insights into chemical and biological processes.
Recently, machine-learned force fields (MLFFs) have emerged as an alternative means to execute MD simulations.
This work proposes a general approach to constructing accurate MLFFs for large-scale molecular simulations.
arXiv Detail & Related papers (2022-05-17T13:08:28Z)
- Improving Molecular Representation Learning with Metric Learning-enhanced Optimal Transport [49.237577649802034]
We develop a novel optimal transport-based algorithm termed MROT to enhance the generalization capability of learned molecular representations in regression problems.
MROT significantly outperforms state-of-the-art models, showing promising potential in accelerating the discovery of new substances.
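For orientation only, the entropy-regularized optimal-transport cost that such methods build on can be computed with standard Sinkhorn iterations; this generic sketch is not the MROT algorithm itself:

```python
import numpy as np

def sinkhorn(C: np.ndarray, eps: float = 0.1, iters: int = 200) -> float:
    """Entropic OT cost between two uniform distributions with cost matrix C."""
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-C / eps)
    u = np.ones(n)
    for _ in range(iters):                 # alternate marginal scalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]        # transport plan
    return float((P * C).sum())

rng = np.random.default_rng(0)
X, Y = rng.normal(size=(10, 4)), rng.normal(size=(12, 4))
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared-distance cost
cost = sinkhorn(C)
```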
arXiv Detail & Related papers (2022-02-13T04:56:18Z)
- Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments [3.1829446824051195]
We present an improved NN architecture based on the previous GM-NN model.
The improved methodology is a prerequisite for training-heavy workflows such as active learning or learning-on-the-fly.
arXiv Detail & Related papers (2021-09-20T14:23:34Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
- BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.