Towards Robust Probabilistic Modeling on SO(3) via Rotation Laplace
Distribution
- URL: http://arxiv.org/abs/2305.10465v1
- Date: Wed, 17 May 2023 12:31:48 GMT
- Title: Towards Robust Probabilistic Modeling on SO(3) via Rotation Laplace
Distribution
- Authors: Yingda Yin, Jiangran Lyu, Yang Wang, He Wang, Baoquan Chen
- Abstract summary: Estimating the 3DoF rotation from a single RGB image is a challenging problem.
In this paper, we propose a novel rotation Laplace distribution on SO(3).
Our method is robust to the disturbance of outliers and concentrates gradient on the low-error region where it can still improve.
- Score: 32.26083557492705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Estimating the 3DoF rotation from a single RGB image is an important yet
challenging problem. As a popular approach, probabilistic rotation modeling
additionally carries prediction uncertainty information, compared to
single-prediction rotation regression. For modeling probabilistic distributions
over SO(3), it is natural to use the Gaussian-like Bingham and matrix Fisher
distributions; however, they are shown to be sensitive to outlier predictions, e.g.
$180^\circ$ errors, and thus are unlikely to converge to optimal performance.
In this paper, we draw inspiration from multivariate Laplace distribution and
propose a novel rotation Laplace distribution on SO(3). Our rotation Laplace
distribution is robust to the disturbance of outliers and concentrates
gradient on the low-error region where it can still improve. In addition, we show that
our method also exhibits robustness to small noises and thus tolerates
imperfect annotations. With this benefit, we demonstrate its advantages in
semi-supervised rotation regression, where the pseudo labels are noisy. To
further capture the multi-modal rotation solution space for symmetric objects,
we extend our distribution to rotation Laplace mixture model and demonstrate
its effectiveness. Our extensive experiments show that our proposed
distribution and the mixture model achieve state-of-the-art performance in all
the rotation regression experiments over both probabilistic and
non-probabilistic baselines.
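The abstract's central claim is that a Laplace-style cost keeps a large gradient at small rotation errors, where a Fisher-style cost's gradient vanishes. The numpy sketch below is a hypothetical illustration of that behavior, not the paper's exact density: the true rotation Laplace negative log-likelihood also involves a normalizing constant over SO(3), which is omitted here, and the function names are invented for this example.

```python
import numpy as np

def rot_z(angle):
    # rotation about the z-axis by `angle` radians
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def fisher_style_nll(A, R):
    # unnormalized matrix-Fisher-style cost: -tr(A^T R),
    # linear in the trace gap, so its gradient vanishes near zero error
    return -np.trace(A.T @ R)

def laplace_style_nll(A, R, eps=1e-8):
    # hypothetical Laplace-style cost: sqrt(tr(S) - tr(A^T R)),
    # where S holds the singular values of the predicted parameter A;
    # the square root keeps the gradient large at small errors
    s = np.linalg.svd(A, compute_uv=False)
    return np.sqrt(max(s.sum() - np.trace(A.T @ R), eps))

# with A = I, the trace gap for a z-rotation by theta is 2(1 - cos(theta)),
# so the Laplace-style cost grows like |theta| while the Fisher-style
# cost grows like theta^2 -- gradient is preserved in the low-error region
A = np.eye(3)
h = 1e-3
laplace_slope = (laplace_style_nll(A, rot_z(h)) - laplace_style_nll(A, rot_z(0.0))) / h
fisher_slope = (fisher_style_nll(A, rot_z(h)) - fisher_style_nll(A, rot_z(0.0))) / h
print(laplace_slope, fisher_slope)  # Laplace slope stays O(1); Fisher slope is near 0
```

This mirrors the robustness intuition in reverse as well: for large ($180^\circ$) outlier errors the square root flattens the cost, so a single outlier contributes a bounded gradient instead of dominating the update.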
Related papers
- A Stein Gradient Descent Approach for Doubly Intractable Distributions [5.63014864822787]
We propose a novel Monte Carlo Stein variational gradient descent (MC-SVGD) approach for inference for doubly intractable distributions.
The proposed method achieves substantial computational gains over existing algorithms, while providing comparable inferential performance for the posterior distributions.
arXiv Detail & Related papers (2024-10-28T13:42:27Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Learning Gaussian Representation for Eye Fixation Prediction [54.88001757991433]
Existing eye fixation prediction methods perform the mapping from input images to the corresponding dense fixation maps generated from raw fixation points.
We introduce Gaussian Representation for eye fixation modeling.
We design our framework upon some lightweight backbones to achieve real-time fixation prediction.
arXiv Detail & Related papers (2024-03-21T20:28:22Z)
- Delving into Discrete Normalizing Flows on SO(3) Manifold for Probabilistic Rotation Modeling [30.09829541716024]
We propose a novel normalizing flow on SO(3) manifold.
We show that our rotation normalizing flows significantly outperform the baselines on both unconditional and conditional tasks.
arXiv Detail & Related papers (2023-04-08T06:52:02Z)
- A Laplace-inspired Distribution on SO(3) for Probabilistic Rotation Estimation [35.242645262982045]
Estimating the 3DoF rotation from a single RGB image is an important yet challenging problem.
We propose a novel Rotation Laplace distribution on SO(3).
Our experiments show that our proposed distribution achieves state-of-the-art performance for rotation regression tasks.
arXiv Detail & Related papers (2023-03-03T07:10:02Z)
- Robust Gaussian Process Regression with Huber Likelihood [2.7184224088243365]
We propose a robust process model in the Gaussian process framework with the likelihood of observed data expressed as the Huber probability distribution.
The proposed model employs weights based on projection statistics to scale residuals and bound the influence of vertical outliers and bad leverage points on the latent function estimates.
arXiv Detail & Related papers (2023-01-19T02:59:33Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [56.69671152009899]
We show that deep ensembles provide an effective mechanism for approximate Bayesian marginalization.
We also propose a related approach that further improves the predictive distribution by marginalizing within basins of attraction.
arXiv Detail & Related papers (2020-02-20T15:13:27Z)
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study Bayesian quadrature optimization (BQO) under distributional uncertainty, in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.