A Laplace-inspired Distribution on SO(3) for Probabilistic Rotation Estimation
- URL: http://arxiv.org/abs/2303.01743v1
- Date: Fri, 3 Mar 2023 07:10:02 GMT
- Title: A Laplace-inspired Distribution on SO(3) for Probabilistic Rotation Estimation
- Authors: Yingda Yin, Yang Wang, He Wang, Baoquan Chen
- Abstract summary: Estimating the 3DoF rotation from a single RGB image is an important yet challenging problem.
We propose a novel Rotation Laplace distribution on SO(3).
Our experiments show that our proposed distribution achieves state-of-the-art performance for rotation regression tasks.
- Score: 35.242645262982045
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Estimating the 3DoF rotation from a single RGB image is an important
yet challenging problem. Probabilistic rotation regression has attracted
increasing attention because it expresses uncertainty information alongside the
prediction. Although modeling noise with the Gaussian-like Bingham distribution
or matrix Fisher distribution is natural, these distributions are known to be
sensitive to outliers because they penalize deviations quadratically. In this
paper, we draw inspiration from the multivariate Laplace distribution and
propose a novel Rotation Laplace distribution on SO(3). The Rotation Laplace
distribution is robust to the disturbance of outliers and applies a strong
gradient in the low-error region, resulting in better convergence. Our
extensive experiments show that the proposed distribution achieves
state-of-the-art performance on rotation regression tasks over both
probabilistic and non-probabilistic baselines. Our project page is at
https://pku-epic.github.io/RotationLaplace.
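The core argument above — that a Laplace-style (linear) penalty on rotation error is more outlier-robust than a Gaussian-style (quadratic) one — can be illustrated with a minimal sketch. This is not the paper's implementation; the function names and the example rotations are illustrative only.

```python
import numpy as np

def geodesic_angle(R1, R2):
    """Rotation error in radians between two rotation matrices."""
    cos = (np.trace(R1.T @ R2) - 1.0) / 2.0
    return np.arccos(np.clip(cos, -1.0, 1.0))

def quadratic_loss(theta):
    return theta ** 2       # Gaussian-like: outliers dominate the objective

def linear_loss(theta):
    return np.abs(theta)    # Laplace-like: bounded influence of outliers

R_pred = np.eye(3)
# A large "outlier" error: the ground truth is rotated 90 degrees about z.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R_gt = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

theta = geodesic_angle(R_pred, R_gt)
print(quadratic_loss(theta), linear_loss(theta))
```

As the error grows, the quadratic loss (and its gradient) grows without bound, so a single bad label can dominate training; the linear loss keeps a constant-magnitude gradient, which matches the abstract's claim of robustness and strong gradients in the low-error region.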
Related papers
- A Stein Gradient Descent Approach for Doubly Intractable Distributions [5.63014864822787]
We propose a novel Monte Carlo Stein variational gradient descent (MC-SVGD) approach for inference for doubly intractable distributions.
The proposed method achieves substantial computational gains over existing algorithms, while providing comparable inferential performance for the posterior distributions.
arXiv Detail & Related papers (2024-10-28T13:42:27Z)
- Learning Gaussian Representation for Eye Fixation Prediction [54.88001757991433]
Existing eye fixation prediction methods perform the mapping from input images to the corresponding dense fixation maps generated from raw fixation points.
We introduce Gaussian Representation for eye fixation modeling.
We design our framework upon some lightweight backbones to achieve real-time fixation prediction.
arXiv Detail & Related papers (2024-03-21T20:28:22Z)
- Towards Robust Probabilistic Modeling on SO(3) via Rotation Laplace Distribution [32.26083557492705]
Estimating the 3DoF rotation from a single RGB image is a challenging problem.
In this paper, we propose a novel rotation Laplace distribution on SO(3).
Our method is robust to the disturbance of outliers and applies a strong gradient in the low-error region, yielding better convergence.
arXiv Detail & Related papers (2023-05-17T12:31:48Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Probabilistic Rotation Representation With an Efficiently Computable Bingham Loss Function and Its Application to Pose Estimation [0.0]
We propose a fast-computable and easy-to-implement loss function for Bingham distribution.
We examine not only the parametrization of the Bingham distribution but also an application based on our loss function.
arXiv Detail & Related papers (2022-03-09T00:38:28Z)
- Robust Estimation for Nonparametric Families via Generative Adversarial Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends these to robust mean estimation, second moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z)
- Distributed Random Reshuffling over Networks [7.013052033764372]
A distributed random reshuffling (D-RR) algorithm is proposed for minimizing smooth and convex objective functions.
In particular, for smooth strongly convex objective functions, D-RR achieves an O(1/T^2) convergence rate (where $T$ counts the epoch number) in terms of the squared distance between the iterate and the global minimizer.
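The reshuffling idea behind D-RR can be sketched in its simplest, single-node form: once per epoch the data is randomly permuted, then incremental gradient steps sweep through every sample exactly once. The distributed variant additionally mixes iterates with network neighbors, which is omitted here; all names and the toy least-squares objective are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy objective: f(x) = (1/n) * sum_i (a_i * x - b_i)^2 / 2, minimized near x = 3.
a = rng.normal(size=20)
b = 3.0 * a + rng.normal(scale=0.01, size=20)

x, lr = 0.0, 0.05
for epoch in range(200):                # T epochs
    order = rng.permutation(len(a))     # reshuffle once per epoch
    for i in order:                     # one pass over all samples, no replacement
        grad = (a[i] * x - b[i]) * a[i]
        x -= lr * grad

print(x)  # close to 3.0
```

Sampling without replacement within each epoch is what distinguishes random reshuffling from plain SGD, and it is the property the cited convergence analysis exploits.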
arXiv Detail & Related papers (2021-12-31T03:59:37Z)
- Uncertainty Inspired RGB-D Saliency Detection [70.50583438784571]
We propose the first framework to employ uncertainty for RGB-D saliency detection by learning from the data labeling process.
Inspired by the saliency data labeling process, we propose a generative architecture to achieve probabilistic RGB-D saliency detection.
Results on six challenging RGB-D benchmark datasets show our approach's superior performance in learning the distribution of saliency maps.
arXiv Detail & Related papers (2020-09-07T13:01:45Z)
- Probabilistic orientation estimation with matrix Fisher distributions [0.0]
This paper focuses on estimating probability distributions over the set of 3D rotations using deep neural networks.
Learning to regress models to the set of rotations is inherently difficult due to differences in topology.
We overcome this issue by using a neural network to output the parameters for a matrix Fisher distribution.
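A hedged sketch of that idea: the network regresses an unconstrained 3x3 parameter matrix A, and the mode of the matrix Fisher distribution F(R) ∝ exp(tr(AᵀR)) is the rotation closest to A, recovered via an SVD with a determinant sign fix (a special orthogonal Procrustes step). The "network output" below is just a random matrix standing in for a regression head.

```python
import numpy as np

def matrix_fisher_mode(A):
    """Mode of the matrix Fisher distribution with parameter matrix A."""
    U, _, Vt = np.linalg.svd(A)
    # Flip the last singular direction if needed so that det(R) = +1,
    # keeping R in SO(3) rather than merely O(3).
    S = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    return U @ S @ Vt

A = np.random.default_rng(1).normal(size=(3, 3))  # stand-in for a network head
R = matrix_fisher_mode(A)

# R is a valid rotation: orthogonal with determinant +1.
print(np.allclose(R.T @ R, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```

Because A is unconstrained, the network avoids the topological difficulties of regressing rotations directly, while the singular values of A encode concentration (uncertainty) around the mode.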
arXiv Detail & Related papers (2020-06-17T09:28:19Z)
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.