Graph Neural Network Interatomic Potential Ensembles with Calibrated
Aleatoric and Epistemic Uncertainty on Energy and Forces
- URL: http://arxiv.org/abs/2305.16325v2
- Date: Mon, 11 Sep 2023 12:15:09 GMT
- Title: Graph Neural Network Interatomic Potential Ensembles with Calibrated
Aleatoric and Epistemic Uncertainty on Energy and Forces
- Authors: Jonas Busk, Mikkel N. Schmidt, Ole Winther, Tejs Vegge and Peter
Bjørn Jørgensen
- Abstract summary: We present a complete framework for training and recalibrating graph neural network ensemble models to produce accurate predictions of energy and forces.
The proposed method considers both epistemic and aleatoric uncertainty and the total uncertainties are recalibrated post hoc.
A detailed analysis of the predictive performance and uncertainty calibration is provided.
- Score: 9.378581265532006
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inexpensive machine learning potentials are increasingly being used to speed
up structural optimization and molecular dynamics simulations of materials by
iteratively predicting and applying interatomic forces. In these settings, it
is crucial to detect when predictions are unreliable to avoid wrong or
misleading results. Here, we present a complete framework for training and
recalibrating graph neural network ensemble models to produce accurate
predictions of energy and forces with calibrated uncertainty estimates. The
proposed method considers both epistemic and aleatoric uncertainty and the
total uncertainties are recalibrated post hoc using a nonlinear scaling
function to achieve good calibration on previously unseen data, without loss of
predictive accuracy. The method is demonstrated and evaluated on two
challenging, publicly available datasets, ANI-1x (Smith et al.) and
Transition1x (Schreiner et al.), both containing diverse conformations far from
equilibrium. A detailed analysis of the predictive performance and uncertainty
calibration is provided. In all experiments, the proposed method achieved low
prediction error and good uncertainty calibration, with predicted uncertainty
correlating with expected error, on energy and forces. To the best of our
knowledge, the method presented in this paper is the first to consider a
complete framework for obtaining calibrated epistemic and aleatoric uncertainty
predictions on both energy and forces in ML potentials.
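For orientation, the recipe in the abstract reduces to: each ensemble member predicts a mean and an aleatoric variance, the spread of the member means supplies the epistemic part, and the summed (total) variance is rescaled post hoc on held-out data. The NumPy sketch below is a minimal illustration of that idea under stated assumptions; the function names, the power-law form of the rescaling, and the log-log fitting criterion are placeholders, not the paper's actual GNN architecture or nonlinear scaling function.

```python
# Hedged sketch (not the authors' released code): combine per-model aleatoric
# variances with ensemble disagreement into a total predictive variance, then
# recalibrate the total uncertainty post hoc on held-out data. The power-law
# scaling and the log-log fit are illustrative stand-ins for the paper's
# nonlinear scaling function.
import numpy as np

def ensemble_uncertainty(member_means, member_vars):
    """member_means, member_vars: arrays of shape (M, N) from an M-member ensemble."""
    mean_pred = member_means.mean(axis=0)      # ensemble mean prediction
    epistemic = member_means.var(axis=0)       # disagreement between members
    aleatoric = member_vars.mean(axis=0)       # average predicted noise variance
    return mean_pred, aleatoric + epistemic    # total predictive variance

def fit_recalibration(sigma_val, abs_err_val, eps=1e-12):
    """Fit a monotone power law s' = a * s**b so predicted std tracks observed error."""
    X = np.stack([np.ones_like(sigma_val), np.log(sigma_val + eps)], axis=1)
    coef, *_ = np.linalg.lstsq(X, np.log(abs_err_val + eps), rcond=None)
    log_a, b = coef
    return lambda sigma: np.exp(log_a) * sigma**b

# Usage (placeholder arrays): fit on a validation split, apply to test predictions.
# mu_val, var_val = ensemble_uncertainty(val_means, val_vars)
# recalibrate = fit_recalibration(np.sqrt(var_val), np.abs(val_targets - mu_val))
# mu_test, var_test = ensemble_uncertainty(test_means, test_vars)
# calibrated_std = recalibrate(np.sqrt(var_test))
```

The same decomposition applies per force component; only the array shapes change.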
Related papers
- SAUC: Sparsity-Aware Uncertainty Calibration for Spatiotemporal Prediction with Graph Neural Networks [17.994971799054213]
Existing deep learning methods mostly focus on prediction, overlooking the inherent uncertainty in such predictions.
This paper introduces a novel post-hoc Sparsity-Aware Uncertainty Calibration (SAUC) framework, which calibrates uncertainty in both zero and non-zero values.
Specifically, empirical experiments show a 20% reduction in errors on sparse traffic accident and urban crime prediction tasks.
arXiv Detail & Related papers (2024-09-13T12:20:02Z) - Calibrated Uncertainty Quantification for Operator Learning via
Conformal Prediction [95.75771195913046]
We propose a risk-controlling quantile neural operator, a distribution-free, finite-sample functional calibration conformal prediction method.
We provide a theoretical calibration guarantee on the coverage rate, defined as the expected percentage of points on the function domain whose true values fall within the predicted uncertainty band.
Empirical results on a 2D Darcy flow and a 3D car surface pressure prediction task validate our theoretical results (a generic split-conformal calibration sketch is given after this list).
arXiv Detail & Related papers (2024-02-02T23:43:28Z) - Evidential Deep Learning: Enhancing Predictive Uncertainty Estimation
for Earth System Science Applications [0.32302664881848275]
Evidential deep learning is a technique that extends parametric deep learning to higher-order distributions.
This study compares the uncertainty derived from evidential neural networks to those obtained from ensembles.
We show evidential deep learning models attaining predictive accuracy rivaling standard methods, while robustly quantifying both sources of uncertainty.
arXiv Detail & Related papers (2023-09-22T23:04:51Z) - Confidence and Dispersity Speak: Characterising Prediction Matrix for
Unsupervised Accuracy Estimation [51.809741427975105]
This work aims to assess how well a model performs under distribution shifts without using labels.
We use the nuclear norm that has been shown to be effective in characterizing both properties.
We show that the nuclear norm is more accurate and robust at estimating accuracy than existing methods.
arXiv Detail & Related papers (2023-02-02T13:30:48Z) - Uncertainty Quantification for Traffic Forecasting: A Unified Approach [21.556559649467328]
Uncertainty is an essential consideration for time series forecasting tasks.
In this work, we focus on quantifying the uncertainty of traffic forecasting.
We develop Deep Spatio-Temporal Uncertainty Quantification (DeepSTUQ), which can estimate both aleatoric and epistemic uncertainty.
arXiv Detail & Related papers (2022-08-11T15:21:53Z) - Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative model-based methods, and explain their pros and cons when used in fully/semi/weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z) - Quantifying Model Predictive Uncertainty with Perturbation Theory [21.591460685054546]
We propose a framework for predictive uncertainty quantification of a neural network.
We use perturbation theory from quantum physics to formulate a moment decomposition problem.
Our approach provides fast model predictive uncertainty estimates with much greater precision and calibration.
arXiv Detail & Related papers (2021-09-22T17:55:09Z) - When in Doubt: Neural Non-Parametric Uncertainty Quantification for
Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z) - Improving model calibration with accuracy versus uncertainty
optimization [17.056768055368384]
A well-calibrated model should be accurate when it is certain about its prediction and indicate high uncertainty when it is likely to be inaccurate.
We propose an optimization method that leverages the relationship between accuracy and uncertainty as an anchor for uncertainty calibration.
We demonstrate our approach with mean-field variational inference and compare with state-of-the-art methods.
arXiv Detail & Related papers (2020-12-14T20:19:21Z) - Balance-Subsampled Stable Prediction [55.13512328954456]
We propose a novel balance-subsampled stable prediction (BSSP) algorithm based on the theory of fractional factorial design.
A design-theoretic analysis shows that the proposed method can reduce the confounding effects among predictors induced by the distribution shift.
Numerical experiments on both synthetic and real-world data sets demonstrate that our BSSP algorithm significantly outperforms the baseline methods for stable prediction across unknown test data.
arXiv Detail & Related papers (2020-06-08T07:01:38Z) - Learning to Predict Error for MRI Reconstruction [67.76632988696943]
We demonstrate that predictive uncertainty estimated by the current methods does not highly correlate with prediction error.
We propose a novel method that estimates the target labels and magnitude of the prediction error in two steps.
arXiv Detail & Related papers (2020-02-13T15:55:32Z)
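For reference alongside the calibration-focused entries above (in particular the conformal operator-learning paper), the following is a generic split-conformal calibration sketch; it is not taken from any of the listed papers, and the names, the absolute-residual nonconformity score, and the scalar regression setting are assumptions for illustration.

```python
# Generic split conformal calibration (illustrative; not from the listed papers).
# Wraps any point predictor with residual-based intervals that attain roughly
# (1 - alpha) coverage on exchangeable data.
import numpy as np

def conformal_quantile(scores, alpha=0.1):
    """Finite-sample-corrected quantile of calibration nonconformity scores."""
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def conformal_interval(predict, x_cal, y_cal, x_test, alpha=0.1):
    scores = np.abs(y_cal - predict(x_cal))    # nonconformity: absolute residuals
    q = conformal_quantile(scores, alpha)
    y_hat = predict(x_test)
    return y_hat - q, y_hat + q                # symmetric interval around the prediction

# Usage: lower, upper = conformal_interval(model.predict, X_cal, y_cal, X_test, alpha=0.1)
```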
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.