Generalizability of Graph Neural Network Force Fields for Predicting Solid-State Properties
- URL: http://arxiv.org/abs/2409.09931v1
- Date: Mon, 16 Sep 2024 02:14:26 GMT
- Title: Generalizability of Graph Neural Network Force Fields for Predicting Solid-State Properties
- Authors: Shaswat Mohanty, Yifan Wang, Wei Cai
- Abstract summary: Machine-learned force fields (MLFFs) promise to offer a computationally efficient alternative to ab initio simulations for complex molecular systems.
This work investigates the ability of a graph neural network (GNN)-based MLFF to describe solid-state phenomena not explicitly included during training.
- Score: 8.405078403907241
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine-learned force fields (MLFFs) promise to offer a computationally efficient alternative to ab initio simulations for complex molecular systems. However, ensuring their generalizability beyond training data is crucial for their wide application in studying solid materials. This work investigates the ability of a graph neural network (GNN)-based MLFF, trained on Lennard-Jones Argon, to describe solid-state phenomena not explicitly included during training. We assess the MLFF's performance in predicting phonon density of states (PDOS) for a perfect face-centered cubic (FCC) crystal structure at both zero and finite temperatures. Additionally, we evaluate vacancy migration rates and energy barriers in an imperfect crystal using direct molecular dynamics (MD) simulations and the string method. Notably, vacancy configurations were absent from the training data. Our results demonstrate the MLFF's capability to capture essential solid-state properties with good agreement to reference data, even for unseen configurations. We further discuss data engineering strategies to enhance the generalizability of MLFFs. The proposed set of benchmark tests and workflow for evaluating MLFF performance in describing perfect and imperfect crystals pave the way for reliable application of MLFFs in studying complex solid-state materials.
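As background for the PDOS benchmark described in the abstract, the finite-temperature phonon density of states is commonly estimated from the Fourier transform of the velocity autocorrelation function (VACF) of an MD trajectory. The sketch below illustrates that standard post-processing step only; the function name, the assumed array layout (frames x atoms x 3), and the timestep argument are illustrative assumptions and are not taken from the paper's actual workflow.

```python
# Minimal sketch: phonon density of states (PDOS) from the velocity
# autocorrelation function of an MD trajectory. Inputs are hypothetical:
# `velocities` has shape (n_frames, n_atoms, 3), `dt_ps` is the sampling
# interval in picoseconds. The authors' post-processing may differ.
import numpy as np

def pdos_from_velocities(velocities: np.ndarray, dt_ps: float):
    n_frames = velocities.shape[0]
    v = velocities.reshape(n_frames, -1)           # (frames, atoms * 3)
    # Wiener-Khinchin: VACF via zero-padded FFT, summed over atoms/components.
    v_hat = np.fft.rfft(v, n=2 * n_frames, axis=0)
    acf = np.fft.irfft(v_hat * np.conj(v_hat), axis=0)[:n_frames].sum(axis=1)
    acf /= acf[0]                                  # normalize so VACF(0) = 1
    # PDOS is proportional to the Fourier transform of the normalized VACF.
    window = np.hanning(2 * n_frames)[n_frames:]   # damp truncation ringing
    pdos = np.abs(np.fft.rfft(acf * window))
    freq_thz = np.fft.rfftfreq(n_frames, d=dt_ps)  # 1/ps = THz
    pdos /= pdos.sum() * (freq_thz[1] - freq_thz[0])  # normalize to unit area
    return freq_thz, pdos
```

In the setting of this paper, such a routine could be applied to velocities generated with either the Lennard-Jones reference potential or the GNN-based MLFF, and the two spectra compared directly.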
Related papers
- Overcoming systematic softening in universal machine learning interatomic potentials by fine-tuning [3.321322648845526]
Machine learning interatomic potentials (MLIPs) have introduced a new paradigm for atomic simulations.
Recent advancements have seen the emergence of universal MLIPs (uMLIPs) that are pre-trained on diverse materials datasets.
However, their performance in extrapolating to out-of-distribution complex atomic environments remains unclear.
arXiv Detail & Related papers (2024-05-11T22:30:47Z) - EL-MLFFs: Ensemble Learning of Machine Learning Force Fields [1.8367772188990783]
Machine learning force fields (MLFFs) have emerged as a promising approach to bridge the accuracy of quantum mechanical methods and the efficiency of classical force fields.
We propose a novel ensemble learning framework, EL-MLFFs, which leverages the stacking method to integrate predictions from diverse MLFFs.
We evaluate our approach on two distinct datasets: methane molecules and methanol adsorbed on a Cu(100) surface.
arXiv Detail & Related papers (2024-03-26T09:09:40Z) - Stability-Aware Training of Machine Learning Force Fields with Differentiable Boltzmann Estimators [11.699834591020057]
Stability-Aware Boltzmann Estimator (StABlE) Training is a multi-modal training procedure which leverages joint supervision from reference quantum-mechanical calculations and system observables.
StABlE Training can be viewed as a general semi-empirical framework applicable across MLFF architectures and systems.
arXiv Detail & Related papers (2024-02-21T18:12:07Z) - In Situ Framework for Coupling Simulation and Machine Learning with Application to CFD [51.04126395480625]
Recent years have seen many successful applications of machine learning (ML) to facilitate fluid dynamic computations.
As simulations grow, generating new training datasets for traditional offline learning creates I/O and storage bottlenecks.
This work offers a solution by simplifying the coupling between simulation and ML and enabling in situ training and inference on heterogeneous clusters.
arXiv Detail & Related papers (2023-06-22T14:07:54Z) - Spherical Fourier Neural Operators: Learning Stable Dynamics on the Sphere [53.63505583883769]
We introduce Spherical FNOs (SFNOs) for learning operators on spherical geometries.
SFNOs have important implications for machine learning-based simulation of climate dynamics.
arXiv Detail & Related papers (2023-06-06T16:27:17Z) - Machine Learning Force Fields with Data Cost Aware Training [94.78998399180519]
Machine learning force fields (MLFFs) have been proposed to accelerate molecular dynamics (MD) simulations.
Even for the most data-efficient MLFFs, reaching chemical accuracy can require hundreds of frames of force and energy labels.
We propose ASTEROID, a multi-stage computational framework that lowers the data cost of MLFFs by combining cheap, inaccurate data with expensive, accurate data.
arXiv Detail & Related papers (2023-06-05T04:34:54Z) - Evaluating the Transferability of Machine-Learned Force Fields for Material Property Modeling [2.494740426749958]
We present a more comprehensive set of benchmarking tests for evaluating the transferability of machine-learned force fields.
We use a graph neural network (GNN)-based force field coupled with the OpenMM package to carry out MD simulations for Argon.
Our results show that the model can accurately capture the behavior of the solid phase only when the configurations from the solid phase are included in the training dataset.
arXiv Detail & Related papers (2023-01-10T00:25:48Z) - Forces are not Enough: Benchmark and Critical Evaluation for Machine Learning Force Fields with Molecular Simulations [5.138982355658199]
Molecular dynamics (MD) simulation techniques are widely used for various natural science applications.
We benchmark a collection of state-of-the-art (SOTA) MLFF models and illustrate, in particular, how the commonly benchmarked force accuracy is not well aligned with relevant simulation metrics.
arXiv Detail & Related papers (2022-10-13T17:59:03Z) - Pre-training via Denoising for Molecular Property Prediction [53.409242538744444]
We describe a pre-training technique that utilizes large datasets of 3D molecular structures at equilibrium.
Inspired by recent advances in noise regularization, we base our pre-training objective on denoising.
arXiv Detail & Related papers (2022-05-31T22:28:34Z) - BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFFs) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning (BIGDML) approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z) - Machine Learning Force Fields [54.48599172620472]
Machine Learning (ML) has enabled numerous advances in computational chemistry.
One of the most promising applications is the construction of ML-based force fields (FFs).
This review gives an overview of applications of ML-FFs and the chemical insights that can be obtained from them.
arXiv Detail & Related papers (2020-10-14T13:14:14Z)