Uncertainty Estimation with Deep Learning for Rainfall-Runoff Modelling
- URL: http://arxiv.org/abs/2012.14295v1
- Date: Tue, 15 Dec 2020 20:52:19 GMT
- Title: Uncertainty Estimation with Deep Learning for Rainfall-Runoff Modelling
- Authors: Daniel Klotz, Frederik Kratzert, Martin Gauch, Alden Keefe Sampson,
Günter Klambauer, Sepp Hochreiter, Grey Nearing
- Abstract summary: Uncertainty estimations are critical for actionable hydrological forecasting.
We show that accurate, precise, and reliable uncertainty estimation can be achieved with Deep Learning.
- Score: 4.080450230687111
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Deep Learning is becoming an increasingly important way to produce accurate
hydrological predictions across a wide range of spatial and temporal scales.
Uncertainty estimations are critical for actionable hydrological forecasting,
and while standardized community benchmarks are becoming an increasingly
important part of hydrological model development and research, similar tools
for benchmarking uncertainty estimation are lacking. We establish an
uncertainty estimation benchmarking procedure and present four Deep Learning
baselines, of which three are based on Mixture Density Networks and one is
based on Monte Carlo dropout. Additionally, we provide a post-hoc model
analysis to put forward some qualitative understanding of the resulting models.
Most importantly, however, we show that accurate, precise, and reliable
uncertainty estimation can be achieved with Deep Learning.
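The abstract names two families of baselines: Mixture Density Networks (MDNs) and Monte Carlo dropout. The sketch below is a minimal, assumption-laden PyTorch illustration of both, not the authors' implementation; the head sizes and names (`MDNHead`, `mc_dropout_predict`) are invented for illustration.

```python
# Illustrative only: a Gaussian MDN output head and an MC dropout sampling
# loop of the kind used as uncertainty baselines. Names/sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MDNHead(nn.Module):
    """Maps a hidden state to a K-component Gaussian mixture over runoff."""
    def __init__(self, hidden_size: int, n_components: int = 3):
        super().__init__()
        self.linear = nn.Linear(hidden_size, 3 * n_components)

    def forward(self, h: torch.Tensor) -> torch.distributions.Distribution:
        logits, mu, raw_sigma = self.linear(h).chunk(3, dim=-1)
        sigma = F.softplus(raw_sigma) + 1e-6            # strictly positive scales
        mixture = torch.distributions.Categorical(logits=logits)
        components = torch.distributions.Normal(mu, sigma)
        return torch.distributions.MixtureSameFamily(mixture, components)

def mdn_nll(dist: torch.distributions.Distribution, y: torch.Tensor) -> torch.Tensor:
    """Training objective: negative log-likelihood of the observed discharge."""
    return -dist.log_prob(y).mean()

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 100):
    """MC dropout: keep dropout layers stochastic at test time and sample."""
    model.train()                                       # leaves dropout enabled
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)      # predictive mean, spread
```

In practice the mixture head would sit on top of a recurrent backbone and the resulting distributions would feed the benchmarking procedure; those details are omitted here.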
Related papers
- A Critical Synthesis of Uncertainty Quantification and Foundation Models in Monocular Depth Estimation [13.062551984263031]
Metric depth estimation, which involves predicting absolute distances, poses particular challenges.
We fuse five different uncertainty quantification methods with the current state-of-the-art DepthAnythingV2 foundation model.
Our findings identify fine-tuning with the Gaussian Negative Log-Likelihood Loss (GNLL) as a particularly promising approach (a minimal sketch follows this entry).
arXiv Detail & Related papers (2025-01-14T15:13:00Z) - Deep Modeling of Non-Gaussian Aleatoric Uncertainty [4.969887562291159]
- Deep Modeling of Non-Gaussian Aleatoric Uncertainty [4.969887562291159]
Deep learning offers promising new ways to accurately model aleatoric uncertainty in robotic estimation systems.
In this study, we formulate and evaluate three fundamental deep learning approaches for conditional probability density modeling.
Our results show that these deep learning methods can accurately capture complex uncertainty patterns, highlighting their potential for improving the reliability and robustness of estimation systems.
arXiv Detail & Related papers (2024-05-30T22:13:17Z) - Evidential Deep Learning: Enhancing Predictive Uncertainty Estimation
- Evidential Deep Learning: Enhancing Predictive Uncertainty Estimation for Earth System Science Applications [0.32302664881848275]
Evidential deep learning is a technique that extends parametric deep learning to higher-order distributions.
This study compares the uncertainty derived from evidential neural networks to those obtained from ensembles.
We show that evidential deep learning models attain predictive accuracy rivaling standard methods while robustly quantifying both sources of uncertainty (a generic sketch of an evidential head follows this entry).
arXiv Detail & Related papers (2023-09-22T23:04:51Z) - Measuring and Modeling Uncertainty Degree for Monocular Depth Estimation [50.920911532133154]
- Measuring and Modeling Uncertainty Degree for Monocular Depth Estimation [50.920911532133154]
The intrinsic ill-posedness and ordinal-sensitive nature of monocular depth estimation (MDE) models pose major challenges to estimating the degree of uncertainty.
We propose to model the uncertainty of MDE models from the perspective of the inherent probability distributions.
By simply introducing additional training regularization terms, our model, with a surprisingly simple formulation and without requiring extra modules or multiple inferences, can provide uncertainty estimates with state-of-the-art reliability.
arXiv Detail & Related papers (2023-07-19T12:11:15Z) - The Implicit Delta Method [61.36121543728134]
- The Implicit Delta Method [61.36121543728134]
In this paper, we propose an alternative, the implicit delta method, which works by infinitesimally regularizing the training loss of uncertainty.
We show that the change in the evaluation due to regularization is consistent for the variance of the evaluation estimator, even when the infinitesimal change is approximated by a finite difference.
arXiv Detail & Related papers (2022-11-11T19:34:17Z) - Diffusion Tensor Estimation with Uncertainty Calibration [6.5085381751712506]
- Diffusion Tensor Estimation with Uncertainty Calibration [6.5085381751712506]
We propose a deep learning method to estimate the diffusion tensor and compute the estimation uncertainty.
Data-dependent uncertainty is computed directly by the network and learned via loss attenuation.
We show that the estimation uncertainties computed by the new method can highlight the model's biases, detect domain shift, and reflect the strength of noise in the measurements (a generic loss-attenuation sketch follows this entry).
arXiv Detail & Related papers (2021-11-21T15:58:01Z) - Dense Uncertainty Estimation [62.23555922631451]
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We consider two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons in fully-, semi-, and weakly-supervised frameworks (a minimal ensemble sketch follows this entry).
arXiv Detail & Related papers (2021-10-13T01:23:48Z) - Do Not Forget to Attend to Uncertainty while Mitigating Catastrophic
- Do Not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting [29.196246255389664]
One of the major limitations of deep learning models is that they face catastrophic forgetting in an incremental learning scenario.
We consider a Bayesian formulation to obtain the data and model uncertainties.
We also incorporate a self-attention framework to address the incremental learning problem.
arXiv Detail & Related papers (2021-02-03T06:54:52Z) - Learning Accurate Dense Correspondences and When to Trust Them [161.76275845530964]
- Learning Accurate Dense Correspondences and When to Trust Them [161.76275845530964]
We aim to estimate a dense flow field relating two images, coupled with a robust pixel-wise confidence map.
We develop a flexible probabilistic approach that jointly learns the flow prediction and its uncertainty.
Our approach obtains state-of-the-art results on challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-01-05T18:54:11Z) - Discriminative Jackknife: Quantifying Uncertainty in Deep Learning via
- Discriminative Jackknife: Quantifying Uncertainty in Deep Learning via Higher-Order Influence Functions [121.10450359856242]
We develop a frequentist procedure that utilizes influence functions of a model's loss functional to construct a jackknife (or leave-one-out) estimator of predictive confidence intervals.
The resulting discriminative jackknife (DJ) is applicable to a wide range of deep learning models, is easy to implement, and can be applied in a post-hoc fashion without interfering with model training or compromising its accuracy.
arXiv Detail & Related papers (2020-06-29T13:36:52Z) - On the uncertainty of self-supervised monocular depth estimation [52.13311094743952]
- On the uncertainty of self-supervised monocular depth estimation [52.13311094743952]
Self-supervised paradigms for monocular depth estimation are very appealing since they do not require ground truth annotations at all.
We explore for the first time how to estimate the uncertainty for this task and how this affects depth accuracy.
We propose a novel technique designed specifically for self-supervised approaches.
arXiv Detail & Related papers (2020-05-13T09:00:55Z)