Fast Uncertainty Estimates in Deep Learning Interatomic Potentials
- URL: http://arxiv.org/abs/2211.09866v1
- Date: Thu, 17 Nov 2022 20:13:39 GMT
- Title: Fast Uncertainty Estimates in Deep Learning Interatomic Potentials
- Authors: Albert Zhu, Simon Batzner, Albert Musaelian, Boris Kozinsky
- Abstract summary: We propose a method to estimate the predictive uncertainty based on a single neural network without the need for an ensemble.
We demonstrate that the quality of the uncertainty estimates matches that of deep ensembles.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning has emerged as a promising paradigm to give access to highly accurate predictions of molecular and materials properties. A common shortcoming shared by current approaches, however, is that neural networks only give point estimates of their predictions and do not come with predictive uncertainties associated with these estimates. Existing uncertainty quantification efforts have primarily leveraged the standard deviation of predictions across an ensemble of independently trained neural networks. This incurs a large computational overhead in both training and prediction, often making predictions an order of magnitude more expensive. Here, we propose a method to estimate the predictive uncertainty based on a single neural network without the need for an ensemble. This allows us to obtain uncertainty estimates with virtually no additional computational overhead over standard training and inference. We demonstrate that the quality of the uncertainty estimates matches that of deep ensembles. We further examine the uncertainty estimates of our method and of deep ensembles across the configuration space of our test system and compare the uncertainties to the potential energy surface. Finally, we study the efficacy of the method in an active learning setting and find that it matches an ensemble-based strategy at an order of magnitude lower computational cost.
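For reference, the ensemble baseline described above is straightforward to reproduce. The sketch below is a minimal, hypothetical setup (toy inputs, a toy target, and small MLPs of my choosing; the abstract does not specify the authors' single-network method, which is not shown): train M independently initialized networks and read the uncertainty off as the standard deviation of their predictions.

```python
import torch
import torch.nn as nn

def make_mlp(in_dim=8, hidden=32):
    # Small regression network standing in for an interatomic-potential model.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.SiLU(),
                         nn.Linear(hidden, hidden), nn.SiLU(),
                         nn.Linear(hidden, 1))

def train(model, X, y, epochs=200, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.mse_loss(model(X), y).backward()
        opt.step()
    return model

torch.manual_seed(0)
X = torch.randn(256, 8)                       # toy descriptors
y = X.sin().sum(dim=1, keepdim=True)          # toy energy-like target

# Deep ensemble: M networks differing only in their random initialization.
ensemble = [train(make_mlp(), X, y) for _ in range(5)]

X_new = torch.randn(10, 8)
with torch.no_grad():
    preds = torch.stack([m(X_new) for m in ensemble])   # (M, N, 1)
mean = preds.mean(dim=0)    # ensemble prediction
sigma = preds.std(dim=0)    # uncertainty = spread across ensemble members
```

Training and querying M networks is precisely the M-fold overhead the paper sets out to eliminate.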
Related papers
- Evidential Deep Learning: Enhancing Predictive Uncertainty Estimation for Earth System Science Applications [0.32302664881848275]
Evidential deep learning is a technique that extends parametric deep learning to higher-order distributions.
This study compares the uncertainties derived from evidential neural networks to those obtained from ensembles.
We show that evidential deep learning models attain predictive accuracy rivaling standard methods while robustly quantifying both sources of uncertainty.
arXiv Detail & Related papers (2023-09-22T23:04:51Z)
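Evidential regression, as used in the paper above, makes the network output the parameters of a higher-order distribution over the likelihood, typically a Normal-Inverse-Gamma. Below is a minimal head in that spirit; the softplus constraints are one common parameterization, not necessarily the paper's.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidentialHead(nn.Module):
    """Maps features to Normal-Inverse-Gamma parameters (gamma, nu, alpha, beta)."""
    def __init__(self, in_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, 4)

    def forward(self, h):
        gamma, nu, alpha, beta = self.linear(h).chunk(4, dim=-1)
        return (gamma,                    # predicted mean
                F.softplus(nu),           # nu > 0: virtual evidence for the mean
                F.softplus(alpha) + 1.0,  # alpha > 1: keeps expected variance finite
                F.softplus(beta))         # beta > 0: scale of the variance

# Usage on hypothetical 64-dim features:
# gamma, nu, alpha, beta = EvidentialHead(64)(torch.randn(8, 64))
```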
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty that accounts for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Learning Uncertainty with Artificial Neural Networks for Improved Predictive Process Monitoring [0.114219428942199]
We distinguish two types of learnable uncertainty: model uncertainty due to a lack of training data and noise-induced observational uncertainty.
Our contribution is to apply these uncertainty concepts to predictive process monitoring, training uncertainty-aware models to predict remaining time and outcomes.
arXiv Detail & Related papers (2022-06-13T17:05:27Z)
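One standard way to realize the two uncertainty types above is a two-headed network: a learned per-input variance captures noise-induced observational uncertainty, while dropout sampling captures model uncertainty. Whether the paper uses exactly this construction is an assumption; the sketch shows the generic pattern.

```python
import torch
import torch.nn as nn

class TwoHeadNet(nn.Module):
    """Predicts a mean and a log-variance so observational noise is learned per input."""
    def __init__(self, in_dim, hidden=64, p_drop=0.1):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                  nn.Dropout(p_drop),
                                  nn.Linear(hidden, hidden), nn.ReLU())
        self.mean = nn.Linear(hidden, 1)
        self.log_var = nn.Linear(hidden, 1)

    def forward(self, x):
        h = self.body(x)
        return self.mean(h), self.log_var(h)

def gaussian_nll(mean, log_var, target):
    # Heteroscedastic loss: noisy regions earn a large predicted variance
    # instead of a large squared-error penalty.
    return (0.5 * (log_var + (target - mean) ** 2 / log_var.exp())).mean()
```

Keeping dropout active at inference and taking the variance of sampled means (MC dropout) then supplies the model-uncertainty half.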
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
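For context, deep evidential regression reads its uncertainty split in closed form off the Normal-Inverse-Gamma parameters (same notation as the evidential head sketched earlier):

```python
def nig_uncertainties(nu, alpha, beta):
    """Closed-form split for a Normal-Inverse-Gamma posterior (requires alpha > 1)."""
    aleatoric = beta / (alpha - 1.0)          # E[sigma^2]: irreducible data noise
    epistemic = beta / (nu * (alpha - 1.0))   # Var[mu]: shrinks as evidence nu grows
    return aleatoric, epistemic
```

The paper's point is that, despite these clean formulas, the learned quantities behave as a useful heuristic rather than an exact uncertainty quantification.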
- Training Uncertainty-Aware Classifiers with Conformalized Deep Learning [7.837881800517111]
Deep neural networks are powerful tools to detect hidden patterns in data and leverage them to make predictions, but they are not designed to understand uncertainty.
We develop a novel training algorithm that can lead to more dependable uncertainty estimates, without sacrificing predictive power.
arXiv Detail & Related papers (2022-05-12T05:08:10Z)
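The paper above modifies training itself; the sketch below shows only the standard split-conformal wrapper that conformalized methods build on, assuming softmax probabilities on a held-out calibration set (the function name and toy data are illustrative).

```python
import numpy as np

def conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction: label sets covering the truth ~(1 - alpha) of the time."""
    n = len(cal_labels)
    # Nonconformity score: one minus the probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, level, method="higher")
    return test_probs >= 1.0 - q   # boolean mask: classes included in each set

rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(5), size=500)   # stand-in calibration outputs
cal_labels = rng.integers(0, 5, size=500)
test_probs = rng.dirichlet(np.ones(5), size=3)
print(conformal_sets(cal_probs, cal_labels, test_probs))
```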
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the predictive uncertainty of a classifier based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
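The Nadaraya-Watson estimate at the core of NUQ is a kernel-weighted vote over training labels. The minimal version below uses a Gaussian kernel with a fixed bandwidth (both my simplifications; NUQ itself adds principled bandwidth selection and uncertainty measures on top).

```python
import numpy as np

def nw_label_probs(x_query, X_train, y_train, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of the conditional label distribution p(y | x)."""
    d2 = ((X_train - x_query) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))    # Gaussian kernel weights
    probs = np.array([w[y_train == c].sum() for c in range(n_classes)])
    return probs / probs.sum()

# Uncertainty can then be scored, e.g., as one minus the top class probability:
# u = 1.0 - nw_label_probs(x, X_train, y_train, n_classes).max()
```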
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We study two types of uncertainty estimation solutions, namely ensemble-based and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Quantifying Uncertainty in Deep Spatiotemporal Forecasting [67.77102283276409]
We describe two types of forecasting problems: regular grid-based and graph-based.
We analyze UQ methods from both the Bayesian and the frequentist points of view, casting them in a unified framework via statistical decision theory.
Through extensive experiments on real-world road network traffic, epidemics, and air quality forecasting tasks, we reveal the statistical-computational trade-offs of different UQ methods.
arXiv Detail & Related papers (2021-05-25T14:35:46Z)
- DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is the part of out-of-sample prediction error that is due to the learner's lack of knowledge.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
arXiv Detail & Related papers (2021-02-16T23:50:35Z)
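The DEUP recipe above can be sketched directly: fit a secondary model to the main predictor's out-of-sample errors, then subtract an aleatoric estimate. The regressor choice and the aleatoric_fn argument here are placeholders; the full method also conditions the error predictor on density and other features.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def deup_epistemic(X_val, sq_residuals, aleatoric_fn, X_query):
    """Epistemic estimate = predicted generalization error - aleatoric estimate."""
    # Error predictor trained on held-out squared residuals of the main model.
    error_model = GradientBoostingRegressor().fit(X_val, sq_residuals)
    total = error_model.predict(X_query)
    # Clip at zero: the subtraction can go negative on finite data.
    return np.maximum(total - aleatoric_fn(X_query), 0.0)
```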
- The Benefit of the Doubt: Uncertainty Aware Sensing for Edge Computing Platforms [10.86298377998459]
We propose an efficient framework for predictive uncertainty estimation in NNs deployed on embedded edge systems.
The framework is built from the ground up to provide predictive uncertainty based only on one forward pass.
Our approach not only obtains robust and accurate uncertainty estimates but also outperforms state-of-the-art methods in terms of system performance.
arXiv Detail & Related papers (2021-02-11T11:44:32Z)
- Depth Uncertainty in Neural Networks [2.6763498831034043]
Existing methods for estimating uncertainty in deep learning tend to require multiple forward passes.
By exploiting the sequential structure of feed-forward networks, we are able to both evaluate our training objective and make predictions with a single forward pass.
We validate our approach on real-world regression and image classification tasks.
arXiv Detail & Related papers (2020-06-15T14:33:40Z)
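The single-pass trick in the last paper can be pictured as one output head per depth, so a single forward pass yields a whole set of predictions to marginalize over. The paper learns a categorical posterior over depths variationally; the uniform weighting below is a simplifying assumption of this sketch.

```python
import torch
import torch.nn as nn

class DepthUncertaintyMLP(nn.Module):
    """One forward pass produces a prediction at every depth; the spread
    across depths serves as the uncertainty estimate."""
    def __init__(self, in_dim, hidden=64, depth=5):
        super().__init__()
        self.stem = nn.Linear(in_dim, hidden)
        self.blocks = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(depth))
        self.heads = nn.ModuleList(nn.Linear(hidden, 1) for _ in range(depth))

    def forward(self, x):
        h = torch.relu(self.stem(x))
        outs = []
        for block, head in zip(self.blocks, self.heads):
            h = torch.relu(block(h))
            outs.append(head(h))             # prediction from this depth
        preds = torch.stack(outs)            # (depth, batch, 1)
        # Uniform marginalization over depths (the paper learns these weights).
        return preds.mean(dim=0), preds.var(dim=0)
```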
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.