Distribution Preserving Multiple Hypotheses Prediction for Uncertainty Modeling
- URL: http://arxiv.org/abs/2110.02858v1
- Date: Wed, 6 Oct 2021 15:36:17 GMT
- Title: Distribution Preserving Multiple Hypotheses Prediction for Uncertainty Modeling
- Authors: Tobias Leemann, Moritz Sackmann, Jörn Thielecke, Ulrich Hofmann
- Abstract summary: We propose an alternative loss for distribution preserving Multiple Hypotheses Prediction (MHP).
We empirically show that our approach yields more representative hypotheses on a synthetic and a real-world motion prediction data set.
The outputs of the proposed method can directly be used in sampling-based Monte-Carlo methods.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many supervised machine learning tasks, such as future state prediction in
dynamical systems, require precise modeling of a forecast's uncertainty. The
Multiple Hypotheses Prediction (MHP) approach addresses this problem by
providing several hypotheses that represent possible outcomes. Unfortunately,
with the common $l_2$ loss function, these hypotheses do not preserve the data
distribution's characteristics. We propose an alternative loss for distribution
preserving MHP and review relevant theorems supporting our claims. Furthermore,
we empirically show that our approach yields more representative hypotheses on
a synthetic and a real-world motion prediction data set. The outputs of the
proposed method can directly be used in sampling-based Monte-Carlo methods.
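For context, the sketch below shows the standard relaxed winner-takes-all (WTA) $l_2$ meta-loss that the abstract contrasts against, together with how a set of hypotheses can be used as equally weighted samples in a simple Monte-Carlo estimate. All names (e.g. MHPHead, wta_l2_loss) and the relaxation factor eps are illustrative assumptions; the distribution-preserving loss proposed in the paper is not reproduced here.

```python
# Illustrative only: the common relaxed winner-takes-all (WTA) l2 meta-loss
# used for MHP training, plus Monte-Carlo use of the hypotheses as equally
# weighted samples. The paper's distribution-preserving loss is NOT shown.
import torch
import torch.nn as nn


class MHPHead(nn.Module):
    """Small network producing K hypotheses for a d-dimensional target."""

    def __init__(self, in_dim: int, out_dim: int, num_hypotheses: int = 5):
        super().__init__()
        self.k, self.out_dim = num_hypotheses, out_dim
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, num_hypotheses * out_dim),
        )

    def forward(self, x):
        return self.net(x).view(-1, self.k, self.out_dim)  # (batch, K, out_dim)


def wta_l2_loss(hypotheses, target, eps: float = 0.05):
    """Relaxed WTA l2 loss: the closest hypothesis gets most of the gradient,
    the remaining ones share a small weight eps (assumes K >= 2 hypotheses)."""
    dists = ((hypotheses - target.unsqueeze(1)) ** 2).sum(dim=-1)  # (batch, K)
    winner = dists.argmin(dim=1)                                   # best hypothesis per sample
    weights = torch.full_like(dists, eps / (dists.shape[1] - 1))
    weights.scatter_(1, winner.unsqueeze(1), 1.0 - eps)
    return (weights * dists).sum(dim=1).mean()


def monte_carlo_estimate(hypotheses, fn):
    """Treat the K hypotheses as equally weighted samples and average fn over them."""
    return fn(hypotheses).mean(dim=1)
```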
Related papers
- Posterior Uncertainty Quantification in Neural Networks using Data Augmentation [3.9860047080844807]
We show that deep ensembling is a fundamentally mis-specified model class, since it assumes that future data are supported on existing observations only.
We propose MixupMP, a method that constructs a more realistic predictive distribution using popular data augmentation techniques.
Our empirical analysis showcases that MixupMP achieves superior predictive performance and uncertainty quantification on various image classification datasets.
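The snippet below sketches only the mixup augmentation ingredient that this summary alludes to, i.e. convex combinations of randomly paired inputs and labels; it is not the authors' MixupMP construction of a predictive distribution, and the function name and default alpha are assumptions.

```python
# Sketch of the mixup augmentation ingredient only; not the authors' MixupMP
# predictive construction. Names and the default alpha are assumptions.
import numpy as np


def mixup_batch(x, y, alpha: float = 0.4, rng=None):
    """x: (N, d) inputs, y: (N, c) one-hot or regression targets."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha, size=(x.shape[0], 1))  # per-example mixing weights
    perm = rng.permutation(x.shape[0])                  # random partner for each example
    return lam * x + (1.0 - lam) * x[perm], lam * y + (1.0 - lam) * y[perm]
```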
arXiv Detail & Related papers (2024-03-18T17:46:07Z)
- Source-Free Unsupervised Domain Adaptation with Hypothesis Consolidation of Prediction Rationale [53.152460508207184]
Source-Free Unsupervised Domain Adaptation (SFUDA) is a challenging task where a model needs to be adapted to a new domain without access to target domain labels or source domain data.
This paper proposes a novel approach that considers multiple prediction hypotheses for each sample and investigates the rationale behind each hypothesis.
To achieve optimal performance, we propose a three-step adaptation process: model pre-adaptation, hypothesis consolidation, and semi-supervised learning.
arXiv Detail & Related papers (2024-02-02T05:53:22Z)
- Introducing an Improved Information-Theoretic Measure of Predictive Uncertainty [6.3398383724486544]
Predictive uncertainty is commonly measured by the entropy of the Bayesian model average (BMA) predictive distribution.
We introduce a theoretically grounded measure that overcomes the limitations of this entropy-based approach.
We find that our introduced measure behaves more reasonably in controlled synthetic tasks.
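For reference, the snippet below computes the entropy of the BMA predictive distribution that this summary mentions, along with its standard decomposition into aleatoric and epistemic parts; the improved measure introduced by the paper is not reproduced here.

```python
# Baseline quantities referred to in the summary: the entropy of the Bayesian
# model average (BMA) and its standard aleatoric/epistemic decomposition.
import numpy as np


def bma_uncertainty(member_probs, eps: float = 1e-12):
    """member_probs: (num_models, num_classes) class probabilities per ensemble member."""
    bma = member_probs.mean(axis=0)                                   # BMA predictive distribution
    total = -(bma * np.log(bma + eps)).sum()                          # entropy of the BMA
    aleatoric = -(member_probs * np.log(member_probs + eps)).sum(axis=1).mean()
    epistemic = total - aleatoric                                     # mutual information
    return total, aleatoric, epistemic
```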
arXiv Detail & Related papers (2023-11-14T16:55:12Z)
- Invariant Probabilistic Prediction [45.90606906307022]
We show that arbitrary distribution shifts do not, in general, admit invariant and robust probabilistic predictions.
We propose a method to yield invariant probabilistic predictions, called IPP, and study the consistency of the underlying parameters.
arXiv Detail & Related papers (2023-09-18T18:50:24Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to quantify, via Bayesian approximation, the forecasting uncertainty that deterministic approaches fail to capture.
The effect of dropout weights and of long-term prediction on future state uncertainty is studied.
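For context, the sketch below shows generic test-time Monte-Carlo dropout, the kind of Bayesian approximation this summary refers to: dropout is kept active and repeated stochastic forward passes yield a predictive mean and spread. The function is an illustration and not the authors' trajectory model.

```python
# Generic test-time Monte-Carlo dropout sketch; the model is a placeholder,
# not the authors' trajectory network.
import torch
import torch.nn as nn


def mc_dropout_forecast(model: nn.Module, x: torch.Tensor, num_samples: int = 50):
    """Repeated stochastic forward passes with dropout kept active."""
    model.eval()
    for m in model.modules():              # re-enable only the dropout layers
        if isinstance(m, nn.Dropout):
            m.train()
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(num_samples)])
    return samples.mean(dim=0), samples.std(dim=0)  # predictive mean and spread
```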
arXiv Detail & Related papers (2022-05-04T04:23:38Z)
- Evaluation of Machine Learning Techniques for Forecast Uncertainty Quantification [0.13999481573773068]
Ensemble forecasting is, so far, the most successful approach to produce relevant forecasts along with an estimation of their uncertainty.
Main limitations of ensemble forecasting are the high computational cost and the difficulty to capture and quantify different sources of uncertainty.
In this work, proof-of-concept model experiments are conducted to examine the performance of ANNs trained to predict a corrected state of the system and the state uncertainty, using only a single deterministic forecast as input.
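As a rough illustration of such a setup, the sketch below defines a network that maps a single deterministic forecast to a corrected state plus a per-variable uncertainty, trained with a Gaussian negative log-likelihood; architecture and names are assumptions, not the configuration used in the paper.

```python
# Rough sketch of the described setup: one deterministic forecast in, a
# corrected state plus a per-variable log-variance out, trained with a
# Gaussian negative log-likelihood. Architecture and names are assumptions.
import torch
import torch.nn as nn


class ForecastCorrector(nn.Module):
    def __init__(self, state_dim: int, hidden: int = 128):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, state_dim)    # corrected state
        self.logvar_head = nn.Linear(hidden, state_dim)  # per-variable log-variance

    def forward(self, forecast):
        h = self.body(forecast)
        return self.mean_head(h), self.logvar_head(h)


def gaussian_nll(mean, logvar, truth):
    """Negative log-likelihood of the truth under a diagonal Gaussian (up to a constant)."""
    return 0.5 * (logvar + (truth - mean) ** 2 / logvar.exp()).sum(dim=-1).mean()
```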
arXiv Detail & Related papers (2021-11-29T16:52:17Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
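The summary does not detail the model, so the snippet below only illustrates a common building block for covariance prediction in motion forecasting: mapping unconstrained network outputs to a valid 2x2 covariance through a Cholesky factor. It is a generic sketch, not CovarianceNet's conditional generative model.

```python
# Generic building block for covariance prediction: turn unconstrained
# network outputs into a valid 2x2 covariance via a Cholesky factor.
# Not the authors' conditional generative model.
import torch
import torch.nn.functional as F


def cholesky_to_covariance(raw: torch.Tensor):
    """raw: (..., 3) unconstrained outputs -> (..., 2, 2) positive-definite covariance."""
    l11 = F.softplus(raw[..., 0])   # positive diagonal entries
    l22 = F.softplus(raw[..., 1])
    l21 = raw[..., 2]               # unconstrained off-diagonal entry
    zeros = torch.zeros_like(l11)
    L = torch.stack([
        torch.stack([l11, zeros], dim=-1),
        torch.stack([l21, l22], dim=-1),
    ], dim=-2)                      # lower-triangular Cholesky factor
    return L @ L.transpose(-1, -2)  # Sigma = L L^T
```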
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
- Test-time Collective Prediction [73.74982509510961]
Multiple parties in machine learning want to jointly make predictions on future test points.
Agents wish to benefit from the collective expertise of the full set of agents, but may not be willing to release their data or model parameters.
We explore a decentralized mechanism to make collective predictions at test time, leveraging each agent's pre-trained model.
arXiv Detail & Related papers (2021-06-22T18:29:58Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.