A New Distributional Ranking Loss With Uncertainty: Illustrated in
Relative Depth Estimation
- URL: http://arxiv.org/abs/2010.07091v1
- Date: Wed, 14 Oct 2020 13:47:18 GMT
- Title: A New Distributional Ranking Loss With Uncertainty: Illustrated in
Relative Depth Estimation
- Authors: Alican Mertan, Yusuf Huseyin Sahin, Damien Jade Duff, Gozde Unal
- Abstract summary: We propose a new approach for the problem of relative depth estimation from a single image.
Instead of directly regressing over depth scores, we formulate the problem as estimation of a probability distribution over depth.
To train our model, we propose a new ranking loss, Distributional Loss, which aims to increase the probability that a farther pixel's depth is greater than a closer pixel's depth.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new approach for the problem of relative depth estimation from a
single image. Instead of directly regressing over depth scores, we formulate
the problem as estimation of a probability distribution over depth and aim to
learn the parameters of the distributions which maximize the likelihood of the
given data. To train our model, we propose a new ranking loss, Distributional
Loss, which aims to increase the probability that a farther pixel's depth is
greater than a closer pixel's depth. Our proposed approach allows the model to
output a confidence in its estimation, in the form of the standard deviation of
the distribution. We achieve state-of-the-art results against a number of
baselines while providing confidence in our estimations. Our analysis shows
that the estimated confidence is a good indicator of accuracy. We investigate
the use of this confidence information in the downstream task of metric depth
estimation to increase its performance.
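The abstract describes the Distributional Loss only at a high level. As a rough illustration, the sketch below assumes each pixel's depth is modeled as an independent Gaussian with a predicted mean and standard deviation; under that assumption, the probability that one pixel is farther than another has a closed form, and the loss can maximize it over ordinal pixel pairs. The function name, the pair/label encoding, and the Gaussian parameterization are illustrative assumptions, not details taken from the paper.

```python
import torch

def distributional_ranking_loss(mu, sigma, pairs, labels, eps=1e-6):
    """Illustrative sketch of a distributional ranking loss over pixel pairs.

    mu, sigma : (N,) predicted per-pixel depth mean and standard deviation
                (Gaussian assumption; the paper's parameterization may differ).
    pairs     : (P, 2) long tensor of (i, j) pixel index pairs.
    labels    : (P,) tensor, +1 if pixel i is annotated as farther than
                pixel j, -1 if pixel j is farther.
    """
    mu_i, mu_j = mu[pairs[:, 0]], mu[pairs[:, 1]]
    var_i, var_j = sigma[pairs[:, 0]] ** 2, sigma[pairs[:, 1]] ** 2

    # With independent Gaussians, d_i - d_j ~ N(mu_i - mu_j, var_i + var_j),
    # so the probability of the annotated ordering is a standard normal CDF
    # of the (signed) standardized mean difference.
    z = labels * (mu_i - mu_j) / torch.sqrt(var_i + var_j + eps)
    p_correct = 0.5 * (1.0 + torch.erf(z / (2.0 ** 0.5)))

    # Maximizing the probability of the correct ordering = minimizing its -log.
    return -torch.log(p_correct + eps).mean()
```

Under such a parameterization, the predicted standard deviation serves directly as the per-pixel confidence mentioned in the abstract and could, for example, be used to down-weight uncertain pixels when fitting relative depth predictions to metric depth in the downstream task.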
Related papers
- Rejection via Learning Density Ratios [50.91522897152437]
Classification with rejection emerges as a learning paradigm which allows models to abstain from making predictions.
We propose a different distributional perspective, where we seek to find an idealized data distribution which maximizes a pretrained model's performance.
Our framework is tested empirically over clean and noisy datasets.
arXiv Detail & Related papers (2024-05-29T01:32:17Z)
- MonoProb: Self-Supervised Monocular Depth Estimation with Interpretable Uncertainty [4.260312058817663]
Self-supervised monocular depth estimation methods aim to be used in critical applications such as autonomous vehicles for environment analysis.
We propose MonoProb, a new unsupervised monocular depth estimation method that returns an interpretable uncertainty.
Our experiments highlight enhancements achieved by our method on standard depth and uncertainty metrics.
arXiv Detail & Related papers (2023-11-10T15:55:14Z)
- Single Image Depth Prediction Made Better: A Multivariate Gaussian Take [163.14849753700682]
We introduce an approach that performs continuous modeling of per-pixel depth.
Our method (named MG) ranks among the most accurate on the KITTI depth-prediction benchmark leaderboard.
arXiv Detail & Related papers (2023-03-31T16:01:03Z)
- Learning Confidence for Transformer-based Neural Machine Translation [38.679505127679846]
We propose an unsupervised confidence estimate learning jointly with the training of the neural machine translation (NMT) model.
We interpret confidence as the number of hints the NMT model needs to make a correct prediction, where more hints indicate lower confidence.
We demonstrate that our learned confidence estimate achieves high accuracy on extensive sentence/word-level quality estimation tasks.
arXiv Detail & Related papers (2022-03-22T01:51:58Z)
- Robust Depth Completion with Uncertainty-Driven Loss Functions [60.9237639890582]
We introduce uncertainty-driven loss functions to improve the robustness of depth completion and to handle its uncertainty.
Our method has been tested on KITTI Depth Completion Benchmark and achieved the state-of-the-art robustness performance in terms of MAE, IMAE, and IRMSE metrics.
arXiv Detail & Related papers (2021-12-15T05:22:34Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We propose the Enhanced Probabilistic Dense Correspondence Network, PDC-Net+, capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- Learning Accurate Dense Correspondences and When to Trust Them [161.76275845530964]
We aim to estimate a dense flow field relating two images, coupled with a robust pixel-wise confidence map.
We develop a flexible probabilistic approach that jointly learns the flow prediction and its uncertainty.
Our approach obtains state-of-the-art results on challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-01-05T18:54:11Z)
- Variational Monocular Depth Estimation for Reliability Prediction [12.951621755732544]
Self-supervised learning for monocular depth estimation is widely investigated as an alternative to the supervised learning approach.
Previous works have successfully improved the accuracy of depth estimation by modifying the model structure.
In this paper, we theoretically formulate a variational model for the monocular depth estimation to predict the reliability of the estimated depth image.
arXiv Detail & Related papers (2020-11-24T06:23:51Z)
- Adaptive confidence thresholding for monocular depth estimation [83.06265443599521]
We propose a new approach to leverage pseudo ground truth depth maps of stereo images generated from self-supervised stereo matching methods.
The confidence map of the pseudo ground truth depth map is estimated to mitigate the performance degradation caused by inaccurate pseudo depth maps.
Experimental results demonstrate superior performance to state-of-the-art monocular depth estimation methods.
arXiv Detail & Related papers (2020-09-27T13:26:16Z)
- PIVEN: A Deep Neural Network for Prediction Intervals with Specific Value Prediction [14.635820704895034]
We present PIVEN, a deep neural network for producing both a PI and a value prediction.
Our approach makes no assumptions regarding data distribution within the PI, making its value prediction more effective for various real-world problems.
arXiv Detail & Related papers (2020-06-09T09:29:58Z)