SDE-Net: Equipping Deep Neural Networks with Uncertainty Estimates
- URL: http://arxiv.org/abs/2008.10546v1
- Date: Mon, 24 Aug 2020 16:33:54 GMT
- Title: SDE-Net: Equipping Deep Neural Networks with Uncertainty Estimates
- Authors: Lingkai Kong, Jimeng Sun and Chao Zhang
- Abstract summary: Uncertainty quantification is a fundamental yet unsolved problem for deep learning.
The Bayesian framework provides a principled way of uncertainty estimation but is often not scalable to modern deep neural nets (DNNs).
We propose a new method for quantifying uncertainties of DNNs from a dynamical system perspective.
- Score: 45.43024126674237
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Uncertainty quantification is a fundamental yet unsolved problem for deep
learning. The Bayesian framework provides a principled way of uncertainty
estimation but is often not scalable to modern deep neural nets (DNNs) that
have a large number of parameters. Non-Bayesian methods are simple to implement
but often conflate different sources of uncertainties and require huge
computing resources. We propose a new method for quantifying uncertainties of
DNNs from a dynamical system perspective. The core of our method is to view DNN
transformations as state evolution of a stochastic dynamical system and
introduce a Brownian motion term for capturing epistemic uncertainty. Based on
this perspective, we propose a neural stochastic differential equation model
(SDE-Net) which consists of (1) a drift net that controls the system to fit the
predictive function; and (2) a diffusion net that captures epistemic
uncertainty. We theoretically analyze the existence and uniqueness of the
solution to SDE-Net. Our experiments demonstrate that the SDE-Net model can
outperform existing uncertainty estimation methods across a series of tasks
where uncertainty plays a fundamental role.
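To make the described architecture concrete, below is a minimal, hedged sketch of the idea in the abstract: the hidden state evolves under an Euler-Maruyama discretization of an SDE, with a drift net steering the state toward the predictive target and a diffusion net scaling a Brownian-motion term intended to capture epistemic uncertainty. Layer sizes, the step count, and feeding the current hidden state to the diffusion net are illustrative assumptions; this is not the authors' released implementation or training objective.

```python
import torch
import torch.nn as nn

class SDENetSketch(nn.Module):
    def __init__(self, dim=64, n_steps=10):
        super().__init__()
        self.n_steps = n_steps
        self.dt = 1.0 / n_steps
        # Drift net f(h, t): controls the state evolution to fit the predictive function.
        self.drift = nn.Sequential(
            nn.Linear(dim + 1, dim), nn.Tanh(), nn.Linear(dim, dim))
        # Diffusion net g(h): per-example noise magnitude meant to capture epistemic uncertainty.
        self.diffusion = nn.Sequential(
            nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1), nn.Softplus())

    def forward(self, h):
        # Euler-Maruyama step: h_{k+1} = h_k + f(h_k, t_k) * dt + g(h_k) * dW_k
        for k in range(self.n_steps):
            t = torch.full((h.size(0), 1), k * self.dt, device=h.device)
            f = self.drift(torch.cat([h, t], dim=1))
            g = self.diffusion(h)                      # (batch, 1)
            dW = torch.randn_like(h) * self.dt ** 0.5  # Brownian increment
            h = h + f * self.dt + g * dW
        return h

# Repeated stochastic forward passes yield a spread of outputs; a larger
# learned diffusion (e.g. on out-of-distribution inputs) widens that spread.
sde_block = SDENetSketch(dim=64, n_steps=10)
x = torch.randn(8, 64)
samples = torch.stack([sde_block(x) for _ in range(20)])
print(samples.std(dim=0).mean())  # average epistemic spread per feature
```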
Related papers
- Uncertainty Quantification for Forward and Inverse Problems of PDEs via Latent Global Evolution [110.99891169486366]
We propose a method that integrates efficient and precise uncertainty quantification into a deep learning-based surrogate model.
Our method endows deep learning-based surrogate models with robust and efficient uncertainty quantification capabilities for both forward and inverse problems.
Our method excels at propagating uncertainty over extended auto-regressive rollouts, making it suitable for scenarios involving long-term predictions.
arXiv Detail & Related papers (2024-02-13T11:22:59Z)
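As a rough illustration of the auto-regressive rollout setting mentioned in the entry above, the sketch below rolls an ensemble of surrogate one-step models forward and uses the spread across members as a proxy for accumulated uncertainty. This is a generic ensemble stand-in with hypothetical function names; the paper's latent-global-evolution method is not reproduced here.

```python
import numpy as np

def rollout_with_uncertainty(step_fns, x0, horizon=100):
    """step_fns: list of surrogate one-step models (the 'ensemble').
    Returns per-step mean and std of the predicted trajectory."""
    states = [np.array(x0, dtype=float) for _ in step_fns]
    means, stds = [], []
    for _ in range(horizon):
        states = [f(s) for f, s in zip(step_fns, states)]  # advance each member
        stacked = np.stack(states)
        means.append(stacked.mean(axis=0))
        stds.append(stacked.std(axis=0))  # member disagreement = uncertainty
    return np.array(means), np.array(stds)

# Toy usage: three slightly different linear surrogates of the same system.
fns = [lambda s, a=a: a * s for a in (0.99, 1.00, 1.01)]
mean_traj, std_traj = rollout_with_uncertainty(fns, x0=[1.0, -0.5])
print(std_traj[-1])  # the spread grows with the rollout horizon
```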
- Neural State-Space Models: Empirical Evaluation of Uncertainty Quantification [0.0]
This paper presents preliminary results on uncertainty quantification for system identification with neural state-space models.
We frame the learning problem in a Bayesian probabilistic setting and obtain posterior distributions for the neural network's weights and outputs.
Based on the posterior, we construct credible intervals on the outputs and define a surprise index which can effectively diagnose usage of the model in a potentially dangerous out-of-distribution regime.
arXiv Detail & Related papers (2023-04-13T08:57:33Z)
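The general recipe in the entry above (posterior predictive samples, credible intervals, an out-of-distribution warning) can be illustrated with a short, hedged sketch. The paper's actual surprise index is not reproduced; the 95% band and the fraction-outside score below are assumptions.

```python
import numpy as np

def credible_interval(samples, level=0.95):
    """samples: (n_samples, n_points) posterior predictive draws of the output."""
    lo = np.quantile(samples, (1 - level) / 2, axis=0)
    hi = np.quantile(samples, 1 - (1 - level) / 2, axis=0)
    return lo, hi

def surprise_score(samples, observed):
    """Fraction of observations outside the 95% credible band --
    a crude stand-in for an out-of-distribution warning signal."""
    lo, hi = credible_interval(samples, level=0.95)
    outside = (observed < lo) | (observed > hi)
    return outside.mean()

# Example: 200 posterior draws over 50 time steps vs. measured outputs.
draws = np.random.randn(200, 50)
measured = np.random.randn(50) * 3.0    # deliberately over-dispersed
print(surprise_score(draws, measured))  # a high value suggests OOD usage
```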
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
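A minimal sketch of the Nadaraya-Watson estimate referenced above, not the NUQ implementation itself: kernel-weighted class frequencies around a query give an estimated p(y | x), and its entropy (together with the total kernel mass) serves as a simple uncertainty signal. The Gaussian kernel, bandwidth, and smoothing constant are illustrative choices.

```python
import numpy as np

def nw_label_distribution(x, train_x, train_y, n_classes, bandwidth=1.0, alpha=1e-3):
    """Nadaraya-Watson estimate of p(y | x) with a Gaussian kernel."""
    sq_dists = np.sum((train_x - x) ** 2, axis=1)
    w = np.exp(-sq_dists / (2.0 * bandwidth ** 2))  # kernel weights
    counts = np.array([w[train_y == c].sum() for c in range(n_classes)])
    probs = (counts + alpha) / (counts.sum() + alpha * n_classes)
    return probs, w.sum()  # total kernel mass ~ training density around x

def predictive_entropy(probs):
    return float(-np.sum(probs * np.log(probs + 1e-12)))

# Toy usage: a query far from all training embeddings receives a near-uniform
# label distribution (high entropy) and a tiny kernel mass, both signalling
# high uncertainty.
rng = np.random.default_rng(0)
train_x = rng.normal(size=(200, 2))
train_y = rng.integers(0, 3, size=200)
p_far, mass_far = nw_label_distribution(np.array([8.0, 8.0]), train_x, train_y, 3)
print(predictive_entropy(p_far), mass_far)
```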
- Accurate and Reliable Forecasting using Stochastic Differential Equations [48.21369419647511]
It is critical yet challenging for deep learning models to properly characterize the uncertainty that is pervasive in real-world environments.
This paper develops SDE-HNN to characterize the interaction between the predictive mean and variance of heteroscedastic neural networks (HNNs) for accurate and reliable regression.
Experiments on challenging datasets show that our method significantly outperforms state-of-the-art baselines in terms of both predictive performance and uncertainty quantification.
arXiv Detail & Related papers (2021-03-28T04:18:11Z)
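For context on the mean-variance interaction mentioned above, here is a generic heteroscedastic regression head trained with the Gaussian negative log-likelihood. It is a plain HNN baseline under assumed layer sizes, not the SDE-HNN model, which according to the summary characterizes the mean-variance interaction via stochastic differential equations.

```python
import torch
import torch.nn as nn

class HeteroscedasticHead(nn.Module):
    def __init__(self, in_dim=32):
        super().__init__()
        self.mean = nn.Linear(in_dim, 1)
        self.log_var = nn.Linear(in_dim, 1)  # predicted per-input log-variance

    def forward(self, h):
        return self.mean(h), self.log_var(h)

def gaussian_nll(mean, log_var, target):
    # Per-sample NLL: a large predicted variance down-weights the squared
    # error but is itself penalised by the log-variance term.
    return 0.5 * (log_var + (target - mean) ** 2 / log_var.exp()).mean()

# Usage: optimise both heads jointly with the NLL.
h = torch.randn(16, 32)
y = torch.randn(16, 1)
head = HeteroscedasticHead(in_dim=32)
mu, log_var = head(h)
loss = gaussian_nll(mu, log_var, y)
```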
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
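A rough, hedged sketch of the idea summarized above: a few context observations are encoded into a latent variable whose samples condition the dynamics of an ODE integrated with simple Euler steps. The encoder shape, mean aggregation, and dynamics network are illustrative guesses, not the NDP architecture from the paper.

```python
import torch
import torch.nn as nn

class TinyNDP(nn.Module):
    def __init__(self, latent_dim=8, state_dim=2):
        super().__init__()
        # Encoder maps each (t_i, x_i) context pair to latent-distribution parameters.
        self.encoder = nn.Sequential(nn.Linear(state_dim + 1, 32), nn.ReLU(),
                                     nn.Linear(32, 2 * latent_dim))
        # Dynamics conditioned on the sampled latent.
        self.dynamics = nn.Sequential(nn.Linear(state_dim + latent_dim, 32),
                                      nn.Tanh(), nn.Linear(32, state_dim))

    def forward(self, ctx_t, ctx_x, x0, n_steps=20, dt=0.05):
        # Aggregate context points into a latent distribution and sample from it.
        enc = self.encoder(torch.cat([ctx_t, ctx_x], dim=-1)).mean(dim=0)
        mu, log_var = enc.chunk(2)
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
        # Integrate the latent-conditioned ODE from the initial state (Euler steps).
        h, traj = x0, [x0]
        for _ in range(n_steps):
            h = h + dt * self.dynamics(torch.cat([h, z], dim=-1))
            traj.append(h)
        return torch.stack(traj)

# Usage: condition on 5 observed (time, state) pairs, then roll out from zero.
ctx_t = torch.linspace(0, 1, 5).unsqueeze(-1)
ctx_x = torch.randn(5, 2)
traj = TinyNDP()(ctx_t, ctx_x, x0=torch.zeros(2))
print(traj.shape)  # (n_steps + 1, state_dim)
```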
- Probabilistic solution of chaotic dynamical system inverse problems using Bayesian Artificial Neural Networks [0.0]
Inverse problems for chaotic systems are numerically challenging.
Small perturbations in model parameters can cause very large changes in estimated forward trajectories.
Bayesian Artificial Neural Networks can be used to simultaneously fit a model and estimate model parameter uncertainty.
arXiv Detail & Related papers (2020-05-26T20:35:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.