Confidence-aware 3D Gaze Estimation and Evaluation Metric
- URL: http://arxiv.org/abs/2303.10062v1
- Date: Fri, 17 Mar 2023 15:44:44 GMT
- Title: Confidence-aware 3D Gaze Estimation and Evaluation Metric
- Authors: Qiaojie Zheng, Jiucai Zhang, Amy Zhang, Xiaoli Zhang
- Abstract summary: We introduce a confidence-aware model that predicts uncertainties together with gaze angle estimations.
We also introduce a novel effectiveness evaluation method based on the causality between eye feature degradation and the rise in inference uncertainty.
- Score: 15.852320764240995
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Deep learning appearance-based 3D gaze estimation is gaining popularity due to its minimal hardware requirements and freedom from constraints. Unreliable and overconfident inferences, however, still limit the adoption of this gaze estimation method. To address these issues, we introduce a confidence-aware model that predicts uncertainties together with gaze angle estimations. We also introduce a novel effectiveness evaluation method, based on the causality between eye feature degradation and the rise in inference uncertainty, to assess the uncertainty estimation. Our confidence-aware model demonstrates reliable uncertainty estimation while providing angular estimation accuracy on par with the state of the art. Compared with the existing statistical uncertainty-angular-error evaluation metric, the proposed effectiveness evaluation approach can more effectively judge the quality of the inferred uncertainty for each individual prediction.
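As a rough, illustrative sketch of the kind of confidence-aware regression head described in the abstract (the paper's exact architecture is not given here, so the backbone features, layer sizes, and loss form below are assumptions), a network can predict gaze angles together with a per-sample log-variance and be trained with a heteroscedastic Gaussian negative log-likelihood, which penalizes confident wrong predictions:

```python
# Minimal sketch of a confidence-aware gaze head (illustrative only; the
# feature extractor, layer sizes, and loss are assumptions, not the
# authors' published architecture).
import torch
import torch.nn as nn

class ConfidenceAwareGazeHead(nn.Module):
    """Predicts gaze angles (yaw, pitch) plus a per-angle log-variance."""

    def __init__(self, feature_dim: int = 512):
        super().__init__()
        self.angle_head = nn.Linear(feature_dim, 2)    # yaw, pitch (radians)
        self.logvar_head = nn.Linear(feature_dim, 2)   # log sigma^2 per angle

    def forward(self, features: torch.Tensor):
        return self.angle_head(features), self.logvar_head(features)

def gaussian_nll(pred_angles, log_var, target_angles):
    """Heteroscedastic Gaussian NLL: large errors are tolerated only when
    the model also reports high uncertainty, discouraging overconfidence."""
    inv_var = torch.exp(-log_var)
    return (0.5 * (inv_var * (pred_angles - target_angles) ** 2 + log_var)).mean()

# Usage with a placeholder feature extractor (e.g. pooled CNN features of eye images):
features = torch.randn(8, 512)
head = ConfidenceAwareGazeHead(512)
angles, log_var = head(features)
loss = gaussian_nll(angles, log_var, torch.zeros(8, 2))
```

Under the same assumptions, the degradation-based effectiveness evaluation described above could be approximated by progressively blurring or occluding the input eye images and checking that the predicted variance rises with the degradation level (e.g. via a rank correlation), rather than only relating uncertainty to angular error statistically.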
Related papers
- Revisiting Confidence Estimation: Towards Reliable Failure Prediction [53.79160907725975]
We find a general, widely existing, but largely neglected phenomenon: most confidence estimation methods are harmful for detecting misclassification errors.
We propose to enlarge the confidence gap by finding flat minima, which yields state-of-the-art failure prediction performance.
arXiv Detail & Related papers (2024-03-05T11:44:14Z)
- Adaptive Uncertainty Estimation via High-Dimensional Testing on Latent Representations [28.875819909902244]
Uncertainty estimation aims to evaluate the confidence of a trained deep neural network.
Existing uncertainty estimation approaches rely on low-dimensional distributional assumptions.
We propose a new framework using data-adaptive high-dimensional hypothesis testing for uncertainty estimation.
arXiv Detail & Related papers (2023-10-25T12:22:18Z)
- Toward Reliable Human Pose Forecasting with Uncertainty [51.628234388046195]
We develop an open-source library for human pose forecasting that includes multiple models and supports several datasets.
We devise two types of uncertainty in the problem to increase performance and convey better trust.
arXiv Detail & Related papers (2023-04-13T17:56:08Z)
- Gradient-based Uncertainty Attribution for Explainable Bayesian Deep Learning [38.34033824352067]
Predictions made by deep learning models are prone to data perturbations, adversarial attacks, and out-of-distribution inputs.
We propose to develop explainable and actionable Bayesian deep learning methods to perform accurate uncertainty quantification.
arXiv Detail & Related papers (2023-04-10T19:14:15Z)
- The Implicit Delta Method [61.36121543728134]
In this paper, we propose an alternative, the implicit delta method, which works by infinitesimally regularizing the training loss of uncertainty.
We show that the change in the evaluation due to regularization is consistent for the variance of the evaluation estimator, even when the infinitesimal change is approximated by a finite difference.
arXiv Detail & Related papers (2022-11-11T19:34:17Z)
- Reliability-Aware Prediction via Uncertainty Learning for Person Image Retrieval [51.83967175585896]
UAL aims at providing reliability-aware predictions by considering data uncertainty and model uncertainty simultaneously.
Data uncertainty captures the "noise" inherent in the sample, while model uncertainty depicts the model's confidence in the sample's prediction.
arXiv Detail & Related papers (2022-10-24T17:53:20Z)
- On Attacking Out-Domain Uncertainty Estimation in Deep Neural Networks [11.929914721626849]
We show that state-of-the-art uncertainty estimation algorithms could fail catastrophically under our proposed adversarial attack.
In particular, we aim at attacking the out-domain uncertainty estimation.
arXiv Detail & Related papers (2022-10-03T23:33:38Z)
- DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is part of out-of-sample prediction error due to the lack of knowledge of the learner.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty (a minimal sketch of this decomposition appears after this list).
arXiv Detail & Related papers (2021-02-16T23:50:35Z)
- The Aleatoric Uncertainty Estimation Using a Separate Formulation with Virtual Residuals [51.71066839337174]
Existing methods can quantify the error in the target estimation, but they tend to underestimate it.
We propose a new separable formulation for the estimation of a signal and of its uncertainty, avoiding the effect of overfitting.
We demonstrate that the proposed method outperforms a state-of-the-art technique for signal and uncertainty estimation.
arXiv Detail & Related papers (2020-11-03T12:11:27Z)
- Real-Time Uncertainty Estimation in Computer Vision via Uncertainty-Aware Distribution Distillation [18.712408359052667]
We propose a simple, easy-to-optimize distillation method for learning the conditional predictive distribution of a pre-trained dropout model.
We empirically test the effectiveness of the proposed method on both semantic segmentation and depth estimation tasks.
arXiv Detail & Related papers (2020-07-31T05:40:39Z)
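As referenced from the DEUP entry above, the sketch below illustrates the stated decomposition only: epistemic uncertainty taken as predicted generalization error minus an aleatoric estimate. The error predictor, the aleatoric estimate, and the zero-clipping are placeholders and assumptions, not the authors' actual training procedure.

```python
# Minimal sketch of a DEUP-style decomposition: epistemic uncertainty as
# predicted generalization error minus an aleatoric estimate. Inputs below
# are hypothetical values for illustration.
import numpy as np

def deup_epistemic(predicted_error: np.ndarray,
                   aleatoric_estimate: np.ndarray) -> np.ndarray:
    """u_epistemic(x) ~ e_hat(x) - a_hat(x), clipped at zero as a practical
    guard for when a noisy error predictor dips below the aleatoric floor."""
    return np.clip(predicted_error - aleatoric_estimate, 0.0, None)

# Example: held-out error predictions and a fixed per-sample noise estimate.
predicted_error = np.array([0.30, 0.12, 0.45])
aleatoric_estimate = np.array([0.10, 0.10, 0.10])
print(deup_epistemic(predicted_error, aleatoric_estimate))  # [0.20 0.02 0.35]
```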
This list is automatically generated from the titles and abstracts of the papers on this site.