On Monocular Depth Estimation and Uncertainty Quantification using
Classification Approaches for Regression
- URL: http://arxiv.org/abs/2202.12369v1
- Date: Thu, 24 Feb 2022 21:40:51 GMT
- Title: On Monocular Depth Estimation and Uncertainty Quantification using
Classification Approaches for Regression
- Authors: Xuanlong Yu, Gianni Franchi, Emanuel Aldea
- Abstract summary: This paper introduces a taxonomy and summary of Classification Approaches for Regression approaches.
It also introduces a new uncertainty estimation solution for CAR.
Experiments reflect the differences in the portability of various CAR methods on two backbones.
- Score: 2.784501414201992
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Monocular depth is important in many tasks, such as 3D reconstruction and
autonomous driving. Deep learning-based models achieve state-of-the-art
performance in this field. A set of novel approaches for estimating monocular
depth consists of transforming the regression task into a classification one.
However, there is a lack of detailed descriptions and comparisons for
Classification Approaches for Regression (CAR) in the community and no in-depth
exploration of their potential for uncertainty estimation. To this end, this
paper will introduce a taxonomy and summary of CAR approaches, a new
uncertainty estimation solution for CAR, and a set of experiments on depth
accuracy and uncertainty quantification for CAR-based models on the KITTI dataset.
The experiments reflect the differences in the portability of various CAR
methods on two backbones. Meanwhile, the newly proposed method for uncertainty
estimation outperforms the ensembling method while requiring only a single
forward pass.
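To make the CAR idea concrete, below is a minimal, hypothetical PyTorch sketch (not the paper's exact architecture; the class name, bin count, and log-spaced discretization are illustrative assumptions): depth regression is cast as classification over K discretized depth bins, the expected bin center gives the depth estimate, and the softmax entropy gives an uncertainty score from a single forward pass.

```python
import math
import torch
import torch.nn as nn

class CARDepthHead(nn.Module):
    """Hypothetical CAR head: predict a distribution over K discretized depth
    bins per pixel; depth = expectation over bin centers, and the softmax
    entropy serves as a single-pass uncertainty score."""
    def __init__(self, in_channels, num_bins=80, min_depth=1e-3, max_depth=80.0):
        super().__init__()
        self.classifier = nn.Conv2d(in_channels, num_bins, kernel_size=1)
        # Log-spaced bin centers, a common discretization choice for depth.
        centers = torch.logspace(math.log10(min_depth), math.log10(max_depth), num_bins)
        self.register_buffer("centers", centers)

    def forward(self, feats):
        logits = self.classifier(feats)                        # (B, K, H, W)
        probs = logits.softmax(dim=1)
        # Depth as the expectation of bin centers under the predicted distribution.
        depth = (probs * self.centers.view(1, -1, 1, 1)).sum(dim=1)
        # Shannon entropy of the bin distribution as a one-pass uncertainty proxy.
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)
        return depth, entropy
```

Usage would be `depth, entropy = head(backbone(images))`, i.e., both the estimate and its uncertainty come from one forward propagation, which is the property the abstract contrasts with ensembling.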
Related papers
- MetricDepth: Enhancing Monocular Depth Estimation with Deep Metric Learning [46.57327530703435]
In monocular depth estimation, the absence of a natural definition of class poses challenges for leveraging deep metric learning.
This paper introduces MetricDepth, a novel method that integrates deep metric learning to enhance the performance of monocular depth estimation.
Experiments across various datasets and model types demonstrate the effectiveness and versatility of MetricDepth.
arXiv Detail & Related papers (2024-12-29T07:57:12Z)
- A Neighbor-Searching Discrepancy-based Drift Detection Scheme for Learning Evolving Data [40.00357483768265]
This work presents a novel real concept drift detection method based on Neighbor-Searching Discrepancy.
The proposed method is able to detect real concept drift with high accuracy while ignoring virtual drift.
It can also indicate the direction of the classification boundary change by identifying the invasion or retreat of a certain class.
arXiv Detail & Related papers (2024-05-23T04:03:36Z)
- Consensus-Adaptive RANSAC [104.87576373187426]
We propose a new RANSAC framework that learns to explore the parameter space by considering the residuals seen so far via a novel attention layer.
The attention mechanism operates on a batch of point-to-model residuals, and updates a per-point estimation state to take into account the consensus found through a lightweight one-step transformer.
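A rough, hypothetical sketch of the update described above (module and parameter names are assumptions; the actual CA-RANSAC layer differs in detail): embed a batch of point-to-model residuals, run one self-attention step across the point set to pool consensus, and update a per-point estimation state.

```python
import torch
import torch.nn as nn

class ResidualAttentionUpdate(nn.Module):
    """Hypothetical sketch: one attention step over point-to-model residuals
    that updates a per-point estimation state."""
    def __init__(self, state_dim=32, num_heads=4):
        super().__init__()
        self.embed = nn.Linear(1, state_dim)            # residual scalar -> feature
        self.attn = nn.MultiheadAttention(state_dim, num_heads, batch_first=True)
        self.update = nn.GRUCell(state_dim, state_dim)  # per-point state update

    def forward(self, residuals, state):
        # residuals: (B, N) residuals of N points; state: (B, N, D)
        tokens = self.embed(residuals.unsqueeze(-1)) + state
        ctx, _ = self.attn(tokens, tokens, tokens)      # consensus via attention
        B, N, D = state.shape
        new_state = self.update(ctx.reshape(B * N, D), state.reshape(B * N, D))
        return new_state.view(B, N, D)
```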
arXiv Detail & Related papers (2023-07-26T08:25:46Z)
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce Fisher Information Matrix (FIM) to measure the informativeness of evidence carried by each sample, according to which we can dynamically reweight the objective loss terms to make the network more focused on the representation learning of uncertain classes.
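As a hedged illustration of that reweighting idea (one plausible reading, not the paper's exact objective): for a Dirichlet output with concentration $\alpha$, the diagonal of the Fisher information is $\psi_1(\alpha_j) - \psi_1(\alpha_0)$ (trigamma terms), and these values can weight the per-class terms of a standard evidential loss.

```python
import torch
import torch.nn.functional as F

def fim_weighted_edl_loss(logits, targets_onehot):
    """Hedged sketch of an I-EDL-style objective: an evidential (Dirichlet)
    squared-error loss whose per-class terms are reweighted by the diagonal of
    the Dirichlet Fisher information. Not the paper's exact formulation."""
    evidence = F.softplus(logits)                  # non-negative evidence
    alpha = evidence + 1.0                         # Dirichlet concentration
    alpha0 = alpha.sum(dim=1, keepdim=True)
    prob = alpha / alpha0                          # Dirichlet mean

    # Diagonal of the Dirichlet Fisher information: psi1(alpha_j) - psi1(alpha_0).
    fim_diag = torch.polygamma(1, alpha) - torch.polygamma(1, alpha0)

    # Evidential squared-error and variance terms (Sensoy et al. style).
    err = (targets_onehot - prob) ** 2
    var = prob * (1.0 - prob) / (alpha0 + 1.0)

    # Focus the loss on classes whose evidence is most informative.
    per_class = fim_diag.detach() * (err + var)
    return per_class.sum(dim=1).mean()
```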
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
- BayesCap: Bayesian Identity Cap for Calibrated Uncertainty in Frozen Neural Networks [50.15201777970128]
We propose BayesCap that learns a Bayesian identity mapping for the frozen model, allowing uncertainty estimation.
BayesCap is a memory-efficient method that can be trained on a small fraction of the original dataset.
We show the efficacy of our method on a wide variety of tasks with a diverse set of architectures.
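A simplified sketch of the idea, using a Gaussian likelihood for brevity (the actual BayesCap uses a heteroscedastic generalized Gaussian, and the class and function names here are assumptions): a small "cap" network takes the frozen model's output, reconstructs it as an identity mapping, and predicts a per-element scale trained by negative log-likelihood.

```python
import torch
import torch.nn as nn

class GaussianCap(nn.Module):
    """Simplified BayesCap-style cap: maps the frozen model's output y_hat to a
    reconstruction mu and a per-element log-scale (Gaussian assumption here)."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU())
        self.mu_head = nn.Linear(hidden, dim)
        self.log_sigma_head = nn.Linear(hidden, dim)

    def forward(self, y_hat):
        h = self.body(y_hat)
        return self.mu_head(h), self.log_sigma_head(h)

def cap_nll(mu, log_sigma, y_hat):
    # Identity-mapping NLL: the cap reconstructs the frozen model's own output
    # while sigma is calibrated as the uncertainty estimate.
    return (((y_hat - mu) ** 2) * torch.exp(-2 * log_sigma) / 2 + log_sigma).mean()
```

Because only the cap is trained, the frozen model's predictions are untouched, which is consistent with the memory-efficiency and small-data claims above.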
arXiv Detail & Related papers (2022-07-14T12:50:09Z)
- Diffusion Tensor Estimation with Uncertainty Calibration [6.5085381751712506]
We propose a deep learning method to estimate the diffusion tensor and compute the estimation uncertainty.
Data-dependent uncertainty is computed directly by the network and learned via loss attenuation.
We show that the estimation uncertainties computed by the new method can highlight the model's biases, detect domain shift, and reflect the strength of noise in the measurements.
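Loss attenuation here refers to the familiar heteroscedastic regression objective in the style of Kendall and Gal; a minimal sketch (the function name is illustrative, and the paper's network and parameterization may differ):

```python
import torch

def attenuated_loss(pred, log_var, target):
    """Heteroscedastic loss attenuation: the network predicts the estimate and
    a log-variance; high predicted variance down-weights the residual but is
    penalized by the log-variance term, so uncertainty is learned from data."""
    precision = torch.exp(-log_var)
    return (0.5 * precision * (pred - target) ** 2 + 0.5 * log_var).mean()
```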
arXiv Detail & Related papers (2021-11-21T15:58:01Z)
- Regularizing Variational Autoencoder with Diversity and Uncertainty Awareness [61.827054365139645]
The Variational Autoencoder (VAE) approximates the posterior of latent variables via amortized variational inference.
We propose an alternative model, DU-VAE, for learning a more Diverse and less Uncertain latent space.
arXiv Detail & Related papers (2021-10-24T07:58:13Z)
- A Variational Bayesian Approach to Learning Latent Variables for Acoustic Knowledge Transfer [55.20627066525205]
We propose a variational Bayesian (VB) approach to learning distributions of latent variables in deep neural network (DNN) models.
Our proposed VB approach can obtain good improvements on target devices, and consistently outperforms 13 state-of-the-art knowledge transfer algorithms.
arXiv Detail & Related papers (2021-10-16T15:54:01Z)
- The Aleatoric Uncertainty Estimation Using a Separate Formulation with Virtual Residuals [51.71066839337174]
Existing methods can quantify the error in the target estimation, but they tend to underestimate it.
We propose a new separable formulation for the estimation of a signal and of its uncertainty, avoiding the effect of overfitting.
We demonstrate that the proposed method outperforms a state-of-the-art technique for signal and uncertainty estimation.
arXiv Detail & Related papers (2020-11-03T12:11:27Z)
- Dropout Strikes Back: Improved Uncertainty Estimation via Diversity Sampling [3.077929914199468]
We show that modifying the sampling distributions for dropout layers in neural networks improves the quality of uncertainty estimation.
Our main idea consists of two steps: computing data-driven correlations between neurons, and generating samples that include maximally diverse neurons; a sketch of both steps follows.
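A hedged sketch of those two steps (the function names and the greedy selection rule are illustrative assumptions; the paper's sampler is determinantal-point-process based rather than greedy):

```python
import torch

def neuron_correlations(activations):
    # activations: (num_samples, num_neurons) collected on held-out data.
    a = activations - activations.mean(dim=0, keepdim=True)
    a = a / (a.std(dim=0, keepdim=True) + 1e-8)
    return (a.t() @ a) / a.shape[0]               # (N, N) correlation matrix

def diverse_dropout_mask(corr, keep):
    """Greedily pick `keep` neurons that are maximally decorrelated from the
    already-chosen set, then build a binary dropout mask over all neurons."""
    n = corr.shape[0]
    chosen = [torch.randint(n, (1,)).item()]      # random seed neuron
    while len(chosen) < keep:
        # Score each neuron by its strongest correlation to any chosen neuron.
        score = corr[:, chosen].abs().max(dim=1).values
        score[torch.tensor(chosen)] = float("inf")
        chosen.append(int(score.argmin()))        # most diverse next neuron
    mask = torch.zeros(n)
    mask[torch.tensor(chosen)] = 1.0
    return mask
```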
arXiv Detail & Related papers (2020-03-06T15:20:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.