Accurate and Reliable Forecasting using Stochastic Differential
Equations
- URL: http://arxiv.org/abs/2103.15041v1
- Date: Sun, 28 Mar 2021 04:18:11 GMT
- Title: Accurate and Reliable Forecasting using Stochastic Differential
Equations
- Authors: Peng Cui, Zhijie Deng, Wenbo Hu and Jun Zhu
- Abstract summary: It is critical yet challenging for deep learning models to properly characterize uncertainty that is pervasive in real-world environments.
This paper develops SDE-HNN to characterize the interaction between the predictive mean and variance of HNNs for accurate and reliable regression.
Experiments on challenging datasets show that our method significantly outperforms state-of-the-art baselines in terms of both predictive performance and uncertainty quantification.
- Score: 48.21369419647511
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: It is critical yet challenging for deep learning models to properly
characterize uncertainty that is pervasive in real-world environments. Although
many efforts have been made, such as heteroscedastic neural networks (HNNs),
little work has demonstrated satisfactory practicability, owing to varying
degrees of compromise among learning efficiency, quality of uncertainty
estimates, and predictive performance. Moreover, existing HNNs typically fail
to construct an explicit interaction between the prediction and its associated
uncertainty. This paper aims to remedy these issues by developing SDE-HNN, a
new heteroscedastic neural network equipped with stochastic differential
equations (SDE) to characterize the interaction between the predictive mean and
variance of HNNs for accurate and reliable regression. Theoretically, we show
the existence and uniqueness of the solution to the devised neural SDE.
Moreover, based on the bias-variance trade-off for the optimization in SDE-HNN,
we design an enhanced numerical SDE solver to improve the learning stability.
Finally, to more systematically evaluate the predictive uncertainty, we present
two new diagnostic uncertainty metrics. Experiments on challenging datasets
show that our method significantly outperforms the state-of-the-art baselines
in terms of both predictive performance and uncertainty quantification,
delivering well-calibrated and sharp prediction intervals.
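The page carries no code, but the heteroscedastic starting point that SDE-HNN builds on is easy to sketch: a network with a mean head and a log-variance head trained under the Gaussian negative log-likelihood. The PyTorch snippet below is a minimal illustration of that baseline HNN setup only, with made-up layer sizes and toy data; it does not reproduce the SDE-based mean-variance coupling, the enhanced solver, or the diagnostic metrics proposed in the paper.

```python
import torch
import torch.nn as nn

class HeteroscedasticMLP(nn.Module):
    """Minimal heteroscedastic regressor: one head for the mean, one for the
    log-variance. Layer sizes are illustrative, not taken from the paper."""
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                  nn.Linear(hidden, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, 1)
        self.logvar_head = nn.Linear(hidden, 1)  # log-variance keeps the variance positive

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(y, mean, logvar):
    # Heteroscedastic Gaussian negative log-likelihood (up to an additive constant).
    return 0.5 * (logvar + (y - mean) ** 2 / logvar.exp()).mean()

# Toy 1-D data with input-dependent noise, just to make the sketch runnable.
torch.manual_seed(0)
x = torch.rand(512, 1) * 6 - 3
y = torch.sin(x) + 0.1 * (1 + x.abs()) * torch.randn_like(x)

model = HeteroscedasticMLP(in_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    mean, logvar = model(x)
    loss = gaussian_nll(y, mean, logvar)
    opt.zero_grad(); loss.backward(); opt.step()
```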
Related papers
- Adversarial Learning for Neural PDE Solvers with Sparse Data [4.226449585713182]
This study introduces a universal learning strategy for neural PDE solvers, named Systematic Model Augmentation for Robust Training (SMART).
By focusing on challenging and improving the model's weaknesses, SMART reduces generalization error during training under data-scarce conditions.
arXiv Detail & Related papers (2024-09-04T04:18:25Z)
- Uncertainty Calibration with Energy Based Instance-wise Scaling in the Wild Dataset [23.155946032377052]
We introduce a novel instance-wise calibration method based on an energy model.
Our method incorporates energy scores instead of softmax confidence scores, allowing for adaptive consideration of uncertainty.
In experiments, we show that the proposed method consistently maintains robust performance across the spectrum.
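As background on the energy scores mentioned above (a standard construction, not necessarily this paper's exact procedure), the energy of a logit vector is usually defined as the negative temperature-scaled log-sum-exp. The sketch below computes it alongside the maximum-softmax confidence it replaces; the temperature and logits are illustrative values.

```python
import numpy as np

def energy_score(logits, temperature=1.0):
    """Energy of a logit vector: -T * logsumexp(logits / T).
    Lower energy typically corresponds to a more confident, in-distribution input."""
    z = logits / temperature
    m = z.max(axis=-1, keepdims=True)  # stabilized log-sum-exp
    return -temperature * (m + np.log(np.exp(z - m).sum(axis=-1, keepdims=True))).squeeze(-1)

def softmax_confidence(logits):
    """Maximum softmax probability, the score the energy formulation replaces."""
    z = logits - logits.max(axis=-1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return p.max(axis=-1)

logits = np.array([[4.0, 1.0, 0.5], [1.1, 1.0, 0.9]])
print(energy_score(logits), softmax_confidence(logits))
```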
arXiv Detail & Related papers (2024-07-17T06:14:55Z)
- Uncertainty in Graph Neural Networks: A Survey [50.63474656037679]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications.
However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions.
This survey aims to provide a comprehensive overview of GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z)
- Stable Neural Stochastic Differential Equations in Analyzing Irregular Time Series Data [3.686808512438363]
Irregular sampling intervals and missing values in real-world time series data present challenges for conventional methods.
We propose three stable classes of Neural SDEs: Langevin-type SDE, Linear Noise SDE, and Geometric SDE.
Our results demonstrate the efficacy of the proposed method in handling real-world irregular time series data.
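As a point of reference for the SDE classes named above, the sketch below simulates a geometric SDE, dX_t = mu * X_t dt + sigma * X_t dW_t, with a plain Euler-Maruyama scheme; the drift and diffusion constants are arbitrary illustrative values, not parameters from the paper.

```python
import numpy as np

def euler_maruyama_geometric(x0, mu, sigma, dt, n_steps, rng):
    """Simulate dX_t = mu * X_t dt + sigma * X_t dW_t with Euler-Maruyama."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over one step
        x[k + 1] = x[k] + mu * x[k] * dt + sigma * x[k] * dW
    return x

rng = np.random.default_rng(0)
path = euler_maruyama_geometric(x0=1.0, mu=0.05, sigma=0.2, dt=0.01, n_steps=500, rng=rng)
print(path[-1])
```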
arXiv Detail & Related papers (2024-02-22T22:00:03Z)
- Uncertainty Quantification for Forward and Inverse Problems of PDEs via Latent Global Evolution [110.99891169486366]
We propose a method that integrates efficient and precise uncertainty quantification into a deep learning-based surrogate model.
Our method endows deep learning-based surrogate models with robust and efficient uncertainty quantification capabilities for both forward and inverse problems.
Our method excels at propagating uncertainty over extended auto-regressive rollouts, making it suitable for scenarios involving long-term predictions.
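Independently of the latent-evolution machinery, the generic recipe for propagating uncertainty through an auto-regressive rollout is to push an ensemble of samples through the one-step model and summarize their spread at every step. The sketch below uses a made-up one-step function and additive noise as stand-ins for a learned stochastic surrogate.

```python
import numpy as np

def rollout_with_uncertainty(step_fn, x0, horizon, n_samples, noise_std, rng):
    """Propagate an ensemble of trajectories through a one-step surrogate and
    report the per-step mean and standard deviation across the ensemble."""
    ensemble = np.repeat(x0[None, :], n_samples, axis=0)  # (n_samples, state_dim)
    means, stds = [], []
    for _ in range(horizon):
        # Additive noise stands in for the surrogate's per-step stochasticity.
        ensemble = step_fn(ensemble) + rng.normal(0.0, noise_std, ensemble.shape)
        means.append(ensemble.mean(axis=0))
        stds.append(ensemble.std(axis=0))
    return np.stack(means), np.stack(stds)

# Toy one-step dynamics standing in for a learned surrogate model.
step_fn = lambda x: 0.95 * x + 0.1 * np.sin(x)
rng = np.random.default_rng(0)
mean_traj, std_traj = rollout_with_uncertainty(step_fn, np.ones(3), 50, 256, 0.05, rng)
print(std_traj[-1])
```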
arXiv Detail & Related papers (2024-02-13T11:22:59Z)
- DiffHybrid-UQ: Uncertainty Quantification for Differentiable Hybrid Neural Modeling [4.76185521514135]
We introduce a novel method, DiffHybrid-UQ, for effective and efficient uncertainty propagation and estimation in hybrid neural differentiable models.
Specifically, our approach effectively discerns and quantifies both aleatoric uncertainties, arising from data noise, and epistemic uncertainties, resulting from model-form discrepancies and data sparsity.
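DiffHybrid-UQ's own machinery is more involved, but the aleatoric/epistemic split it refers to has a standard ensemble form: aleatoric uncertainty is the average of the predicted variances and epistemic uncertainty is the variance of the predicted means. A minimal sketch with made-up numbers:

```python
import numpy as np

def decompose_uncertainty(means, variances):
    """means, variances: arrays of shape (n_models, n_points) from an ensemble
    of heteroscedastic predictors. Returns per-point aleatoric and epistemic variance."""
    aleatoric = variances.mean(axis=0)  # average data-noise estimate
    epistemic = means.var(axis=0)       # disagreement between ensemble members
    return aleatoric, epistemic

means = np.array([[1.0, 2.1], [1.2, 1.9], [0.9, 2.0]])
variances = np.array([[0.05, 0.20], [0.04, 0.25], [0.06, 0.22]])
print(decompose_uncertainty(means, variances))
```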
arXiv Detail & Related papers (2023-12-30T07:40:47Z)
- Variational Voxel Pseudo Image Tracking [127.46919555100543]
Uncertainty estimation is an important task for safety-critical problems such as robotics and autonomous driving.
We propose a Variational Neural Network-based version of a Voxel Pseudo Image Tracking (VPIT) method for 3D Single Object Tracking.
arXiv Detail & Related papers (2023-02-12T13:34:50Z)
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
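For context, deep evidential regression outputs the parameters (gamma, nu, alpha, beta) of a Normal-Inverse-Gamma distribution per input, from which prediction and uncertainty follow in closed form: prediction gamma, aleatoric variance beta/(alpha-1), epistemic variance beta/(nu*(alpha-1)). The helper below reproduces those standard formulas from the original evidential-regression formulation, not code from the critique above; the numeric arguments are made up.

```python
def evidential_regression_summaries(gamma, nu, alpha, beta):
    """Closed-form summaries of a Normal-Inverse-Gamma evidential output
    (prediction, aleatoric variance, epistemic variance); requires alpha > 1."""
    prediction = gamma
    aleatoric = beta / (alpha - 1.0)           # E[sigma^2]
    epistemic = beta / (nu * (alpha - 1.0))    # Var[mu]
    return prediction, aleatoric, epistemic

print(evidential_regression_summaries(gamma=0.3, nu=2.0, alpha=3.0, beta=0.5))
```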
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
- SDE-Net: Equipping Deep Neural Networks with Uncertainty Estimates [45.43024126674237]
Uncertainty quantification is a fundamental yet unsolved problem for deep learning.
The Bayesian framework provides a principled way of estimating uncertainty but is often not scalable to modern deep neural nets (DNNs).
We propose a new method for quantifying uncertainties of DNNs from a dynamical system perspective.
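In the dynamical-system view, the hidden state evolves under a drift network fitting the data and a diffusion network scaling injected Brownian noise, and repeated stochastic forward passes give an uncertainty estimate. The block below is a minimal, hypothetical Euler-Maruyama step in that spirit, not the authors' released SDE-Net code; layer sizes and the step size are illustrative.

```python
import torch
import torch.nn as nn

class TinySDEBlock(nn.Module):
    """One Euler-Maruyama step: h <- h + f(h) * dt + g(h) * sqrt(dt) * eps.
    Drift f and diffusion g are small nets; sizes are illustrative only."""
    def __init__(self, dim, dt=0.1):
        super().__init__()
        self.drift = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))
        self.diffusion = nn.Sequential(nn.Linear(dim, dim), nn.Softplus())  # keeps g > 0
        self.dt = dt

    def forward(self, h):
        eps = torch.randn_like(h)  # Brownian increment (unit normal, scaled by sqrt(dt))
        return h + self.drift(h) * self.dt + self.diffusion(h) * (self.dt ** 0.5) * eps

block = TinySDEBlock(dim=8)
h = torch.randn(4, 8)
samples = torch.stack([block(h) for _ in range(32)])  # repeated stochastic forward passes
print(samples.std(dim=0).mean())                      # spread serves as an uncertainty signal
```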
arXiv Detail & Related papers (2020-08-24T16:33:54Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.