A General Framework for Uncertainty Quantification via Neural SDE-RNN
- URL: http://arxiv.org/abs/2306.01189v1
- Date: Thu, 1 Jun 2023 22:59:45 GMT
- Title: A General Framework for Uncertainty Quantification via Neural SDE-RNN
- Authors: Shweta Dahale, Sai Munikoti, Balasubramaniam Natarajan
- Abstract summary: Uncertainty quantification is a critical yet unsolved challenge for deep learning.
We propose a novel framework based on the principles of recurrent neural networks and neural differential equations for reconciling irregularly sampled measurements.
Our experiments on the IEEE 37 bus test distribution system reveal that our framework can outperform state-of-the-art uncertainty quantification approaches for time-series data imputations.
- Score: 0.3314882635954751
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Uncertainty quantification is a critical yet unsolved challenge for deep
learning, especially for the time series imputation with irregularly sampled
measurements. To tackle this problem, we propose a novel framework based on the
principles of recurrent neural networks and neural stochastic differential
equations for reconciling irregularly sampled measurements. We impute
measurements at any arbitrary timescale and quantify the uncertainty in the
imputations in a principled manner. Specifically, we derive analytical
expressions for quantifying and propagating the epistemic and aleatoric
uncertainty across time instants. Our experiments on the IEEE 37 bus test
distribution system reveal that our framework can outperform state-of-the-art
uncertainty quantification approaches for time-series data imputations.
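The abstract describes propagating epistemic and aleatoric uncertainty across time instants in closed form. The paper's exact expressions are not reproduced here; the sketch below shows the generic flavor of such a recursion for a linear state-space surrogate, where the transition matrix `A`, epistemic covariance `Q_epistemic`, and aleatoric covariance `R_aleatoric` are illustrative assumptions, not the paper's equations:

```python
import numpy as np

def propagate_uncertainty(A, P0, Q_epistemic, R_aleatoric, steps):
    """Propagate a state covariance P through x_{t+1} = A x_t + w_t,
    accumulating epistemic (model) noise Q at each transition and adding
    aleatoric (measurement) noise R to the reported total. Illustrative only."""
    P = P0.copy()
    history = []
    for _ in range(steps):
        P = A @ P @ A.T + Q_epistemic   # epistemic part evolves with the dynamics
        history.append(P + R_aleatoric)  # aleatoric part enters additively
    return history

A = np.array([[0.9, 0.1], [0.0, 0.8]])   # stable toy dynamics (assumed)
P0 = 0.01 * np.eye(2)
Q = 0.05 * np.eye(2)
R = 0.02 * np.eye(2)
covs = propagate_uncertainty(A, P0, Q, R, steps=5)
```

Because the dynamics are stable, the propagated covariance grows from its small initial value toward a steady state rather than diverging, which is the qualitative behavior one wants from any principled uncertainty-propagation scheme.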
Related papers
- Score Matching-based Pseudolikelihood Estimation of Neural Marked
Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching-based pSeudolikelihood estimator for learning marked spatio-temporal point processes with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked spatio-temporal point processes through score matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z) - Neural State-Space Models: Empirical Evaluation of Uncertainty
Quantification [0.0]
This paper presents preliminary results on uncertainty quantification for system identification with neural state-space models.
We frame the learning problem in a Bayesian probabilistic setting and obtain posterior distributions for the neural network's weights and outputs.
Based on the posterior, we construct credible intervals on the outputs and define a surprise index which can effectively diagnose usage of the model in a potentially dangerous out-of-distribution regime.
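The credible-interval and surprise-index ideas in this entry can be sketched directly from posterior predictive samples. The construction below is a minimal illustration under assumed Gaussian draws, not the paper's system-identification setup:

```python
import numpy as np

def credible_interval(samples, level=0.95):
    """Equal-tailed credible interval from posterior predictive samples
    (rows = posterior draws, columns = time steps / outputs)."""
    lo = np.quantile(samples, (1 - level) / 2, axis=0)
    hi = np.quantile(samples, 1 - (1 - level) / 2, axis=0)
    return lo, hi

def surprise_index(y, lo, hi):
    """Fraction of observations falling outside the credible band; values far
    above the nominal miscoverage flag a potentially out-of-distribution regime."""
    return np.mean((y < lo) | (y > hi))

rng = np.random.default_rng(0)
post = rng.normal(0.0, 1.0, size=(2000, 100))   # posterior predictive draws (toy)
y_in = rng.normal(0.0, 1.0, size=100)           # in-distribution observations
y_ood = rng.normal(5.0, 1.0, size=100)          # shifted, out-of-distribution
lo, hi = credible_interval(post)
```

On the in-distribution series the surprise index stays near the nominal 5% miscoverage, while the shifted series lands almost entirely outside the band, which is the diagnostic behavior the entry describes.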
arXiv Detail & Related papers (2023-04-13T08:57:33Z) - The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
arXiv Detail & Related papers (2022-05-20T10:10:32Z) - NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural
Networks [151.03112356092575]
We present a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
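The Nadaraya-Watson estimate of the conditional label distribution is a kernel-weighted average of training labels. A minimal sketch on toy 2-D data (the Gaussian kernel, bandwidth, and clusters are assumptions for illustration, not the full NUQ procedure):

```python
import numpy as np

def nw_class_probs(x_query, X, Y, bandwidth=1.0):
    """Nadaraya-Watson estimate of p(y | x): a kernel-weighted average of
    one-hot training labels with a Gaussian kernel."""
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    onehot = np.eye(Y.max() + 1)[Y]
    return w @ onehot / w.sum()

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
Y = np.array([0] * 50 + [1] * 50)
p_near = nw_class_probs(np.array([-2.0, -2.0]), X, Y)  # deep inside class 0
p_mid = nw_class_probs(np.array([0.0, 0.0]), X, Y)     # between the clusters
```

Queries deep inside a cluster yield near-certain label probabilities, while queries between clusters yield diffuse ones, giving a natural uncertainty signal for a deterministic classifier.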
arXiv Detail & Related papers (2022-02-07T12:30:45Z) - Dense Uncertainty Estimation via an Ensemble-based Conditional Latent
Variable Model [68.34559610536614]
We argue that the aleatoric uncertainty is an inherent attribute of the data and can only be correctly estimated with an unbiased oracle model.
We propose a new sampling and selection strategy at train time to approximate the oracle model for aleatoric uncertainty estimation.
Our results show that our solution achieves both accurate deterministic results and reliable uncertainty estimation.
arXiv Detail & Related papers (2021-11-22T08:54:10Z) - PI3NN: Prediction intervals from three independently trained neural
networks [4.714371905733244]
We propose a novel prediction interval method to learn prediction mean values, lower and upper bounds of prediction intervals from three independently trained neural networks.
Our method requires no distributional assumptions on the data and introduces no unusual hyperparameters to either the neural network models or the loss function.
Numerical experiments on benchmark regression problems show that our method outperforms the state-of-the-art methods with respect to predictive uncertainty quality, robustness, and identification of out-of-distribution samples.
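The three-network idea can be illustrated with least-squares fits standing in for the independently trained networks (an assumption made so the sketch runs without training loops): one model for the mean, and two for the magnitudes of upper and lower deviations, with multipliers chosen to hit the target coverage. This simplification replaces PI3NN's root-finding step with an empirical-quantile choice and is not the paper's exact algorithm:

```python
import numpy as np

def fit_linear(X, y):
    """Least-squares fit standing in for one independently trained network."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return lambda Xq: np.hstack([Xq, np.ones((len(Xq), 1))]) @ w

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, (500, 1))
y = 0.5 * X[:, 0] + rng.normal(0, 0.3, 500)

f_mean = fit_linear(X, y)                        # model 1: point prediction
resid = y - f_mean(X)
up, down = resid >= 0, resid < 0
f_up = fit_linear(X[up], resid[up])              # model 2: upper deviation
f_low = fit_linear(X[down], -resid[down])        # model 3: lower deviation

# Choose multipliers so each half of the band covers 90% of its residuals
# (target miscoverage alpha = 0.1, split evenly between the two tails).
eps = 1e-3
beta_u = np.quantile(resid[up] / np.maximum(f_up(X[up]), eps), 0.9)
beta_l = np.quantile(-resid[down] / np.maximum(f_low(X[down]), eps), 0.9)
lower = f_mean(X) - beta_l * np.maximum(f_low(X), eps)
upper = f_mean(X) + beta_u * np.maximum(f_up(X), eps)
coverage = np.mean((y >= lower) & (y <= upper))
```

The empirical coverage lands near the 90% target by construction, without any distributional assumption on the noise, which mirrors the property the entry claims.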
arXiv Detail & Related papers (2021-08-05T00:55:20Z) - Can a single neuron learn predictive uncertainty? [0.0]
We introduce a novel non-parametric quantile estimation method for continuous random variables based on the simplest neural network architecture with one degree of freedom: a single neuron.
In real-world applications, the method can be used to quantify predictive uncertainty under the split conformal prediction setting.
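A single parameter trained on the pinball (quantile) loss converges to the desired quantile, which is the core of the one-neuron idea. The gradient-descent sketch below uses an assumed learning rate and epoch count; it is an illustration, not the paper's training recipe:

```python
import numpy as np

def fit_quantile_neuron(y, tau, lr=0.1, epochs=2000):
    """One trainable parameter q minimizing the pinball loss
    mean((y - q) * (tau - 1{y < q})); its minimizer is the tau-quantile of y."""
    q = float(np.mean(y))
    for _ in range(epochs):
        # Subgradient of the pinball loss with respect to q.
        grad = np.mean(np.where(y > q, -tau, 1.0 - tau))
        q -= lr * grad
    return q

rng = np.random.default_rng(3)
y = rng.normal(0.0, 1.0, 5000)
q90 = fit_quantile_neuron(y, tau=0.9)   # approaches the 0.9 sample quantile
```

Under split conformal prediction, such a neuron would be fit on a proper training split and its output shifted by a calibration-set quantile of residuals, yielding finite-sample coverage guarantees.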
arXiv Detail & Related papers (2021-06-07T15:12:47Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - The Variational Method of Moments [65.91730154730905]
The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables.
Motivated by a variational minimax reformulation of optimally weighted generalized method of moments (OWGMM), we define a very general class of estimators for the conditional moment problem.
We provide algorithms for valid statistical inference based on the same kind of variational reformulations.
arXiv Detail & Related papers (2020-12-17T07:21:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.