Deep Bayesian ICP Covariance Estimation
- URL: http://arxiv.org/abs/2202.11607v1
- Date: Wed, 23 Feb 2022 16:42:04 GMT
- Title: Deep Bayesian ICP Covariance Estimation
- Authors: Andrea De Maio and Simon Lacroix
- Abstract summary: Covariance estimation for the Iterative Closest Point (ICP) point cloud registration algorithm is essential for state estimation and sensor fusion purposes.
We argue that a major source of error for ICP is in the input data itself, from the sensor noise to the scene geometry.
Benefiting from recent developments in deep learning for point clouds, we propose a data-driven approach to learn an error model for ICP.
- Score: 3.5136071950790737
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Covariance estimation for the Iterative Closest Point (ICP) point cloud
registration algorithm is essential for state estimation and sensor fusion
purposes. We argue that a major source of error for ICP is in the input data
itself, from the sensor noise to the scene geometry. Benefiting from recent
developments in deep learning for point clouds, we propose a data-driven
approach to learn an error model for ICP. We estimate covariances modeling
data-dependent heteroscedastic aleatoric uncertainty, and epistemic uncertainty
using a variational Bayesian approach. The system evaluation is performed on
LiDAR odometry on different datasets, highlighting good results in comparison
to the state of the art.
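As an illustration of the uncertainty decomposition described in the abstract, the sketch below (not the authors' code) trains a regression head with a heteroscedastic Gaussian negative log-likelihood for the data-dependent aleatoric part, and uses Monte Carlo dropout as a simple stand-in for the variational Bayesian treatment of the epistemic part. Network sizes and names are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): a head predicting a mean and a
# log-variance, trained with the heteroscedastic Gaussian NLL, plus MC dropout
# as a simple stand-in for a variational Bayesian epistemic-uncertainty model.
import torch
import torch.nn as nn

class HeteroscedasticHead(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, p_drop: float = 0.1):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(128, 128), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.mean = nn.Linear(128, out_dim)      # e.g. predicted ICP error
        self.log_var = nn.Linear(128, out_dim)   # per-dimension aleatoric log-variance

    def forward(self, x):
        h = self.backbone(x)
        return self.mean(h), self.log_var(h)

def heteroscedastic_nll(mu, log_var, target):
    # Gaussian NLL with input-dependent (heteroscedastic) variance.
    return (0.5 * ((target - mu) ** 2 * torch.exp(-log_var) + log_var)).mean()

@torch.no_grad()
def mc_predict(model, x, n_samples: int = 20):
    # Keep dropout active at test time to draw approximate posterior samples.
    model.train()
    mus, ale = [], []
    for _ in range(n_samples):
        mu, log_var = model(x)
        mus.append(mu)
        ale.append(torch.exp(log_var))
    mus = torch.stack(mus)
    epistemic = mus.var(dim=0)            # spread across stochastic forward passes
    aleatoric = torch.stack(ale).mean(0)  # average predicted data noise
    return mus.mean(0), aleatoric, epistemic
```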
Related papers
- Bayesian Estimation and Tuning-Free Rank Detection for Probability Mass Function Tensors [17.640500920466984]
This paper presents a novel framework for estimating the joint PMF and automatically inferring its rank from observed data.
We derive a deterministic solution based on variational inference (VI) to approximate the posterior distributions of various model parameters. Additionally, we develop a scalable version of the VI-based approach by leveraging stochastic variational inference (SVI).
Experiments involving both synthetic data and real movie recommendation data illustrate the advantages of our VI and SVI-based methods in terms of estimation accuracy, automatic rank detection, and computational efficiency.
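A minimal, generic sketch of the VI-versus-SVI distinction mentioned above (not the paper's PMF-tensor model): a mean-field Gaussian posterior is fitted to a toy model by maximizing a minibatch ELBO in which the likelihood term is rescaled by N/B. All names and hyperparameters are assumptions.

```python
# Illustrative only: generic stochastic VI on a toy Gaussian-mean model.
import torch

torch.manual_seed(0)
N, B = 10_000, 128
data = 2.5 + 0.7 * torch.randn(N)        # toy observations with unknown mean

# Variational parameters of q(theta) = Normal(mu_q, softplus(rho_q)^2)
mu_q = torch.zeros(1, requires_grad=True)
rho_q = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu_q, rho_q], lr=0.05)

for step in range(500):
    batch = data[torch.randint(0, N, (B,))]
    sigma_q = torch.nn.functional.softplus(rho_q)
    theta = mu_q + sigma_q * torch.randn(1)              # reparameterization trick
    log_lik = torch.distributions.Normal(theta, 0.7).log_prob(batch).sum()
    q = torch.distributions.Normal(mu_q, sigma_q)
    prior = torch.distributions.Normal(0.0, 10.0)
    kl = torch.distributions.kl_divergence(q, prior).sum()
    loss = -((N / B) * log_lik - kl)                     # negative minibatch ELBO
    opt.zero_grad(); loss.backward(); opt.step()
```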
arXiv Detail & Related papers (2024-10-08T20:07:49Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
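The paper itself is theoretical; for context, the sketch below shows the standard denoising score matching objective that such score-estimation analyses study. The network architecture and noise schedule are illustrative assumptions, not taken from the paper.

```python
# Not from the paper: a minimal denoising score matching objective, where a
# network s_theta(x_t, sigma) is trained to predict the score of the noised data.
import torch
import torch.nn as nn

class ScoreNet(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, 256), nn.SiLU(),
            nn.Linear(256, 256), nn.SiLU(),
            nn.Linear(256, dim),
        )

    def forward(self, x, sigma):
        return self.net(torch.cat([x, sigma], dim=-1))

def dsm_loss(score_net, x0, sigma_min=0.01, sigma_max=1.0):
    # Sample a noise level, perturb the data, and regress the score -eps/sigma.
    b = x0.shape[0]
    sigma = sigma_min + (sigma_max - sigma_min) * torch.rand(b, 1)
    eps = torch.randn_like(x0)
    xt = x0 + sigma * eps
    target = -eps / sigma
    pred = score_net(xt, sigma)
    return ((pred - target) ** 2 * sigma ** 2).mean()   # sigma^2 weighting

net = ScoreNet(dim=2)
loss = dsm_loss(net, torch.randn(64, 2))   # one training step's loss on toy data
```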
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - Multi-Fidelity Covariance Estimation in the Log-Euclidean Geometry [0.0]
We introduce a multi-fidelity estimator of covariance matrices that employs the log-Euclidean geometry of the symmetric positive-definite manifold.
We develop an optimal sample allocation scheme that minimizes the mean-squared error of the estimator given a fixed budget.
Evaluations of our approach using data from physical applications demonstrate more accurate metric learning and speedups of more than one order of magnitude compared to benchmarks.
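A hedged sketch of the log-Euclidean idea behind such an estimator: SPD covariance matrices are mapped to the tangent space with the matrix logarithm, combined linearly there, and mapped back with the matrix exponential. The weights below are illustrative, not the paper's optimal sample allocation scheme.

```python
# Illustrative log-Euclidean combination of SPD (covariance) matrices.
import numpy as np

def _sym(a):
    return 0.5 * (a + a.T)

def logm_spd(c):
    w, v = np.linalg.eigh(_sym(c))
    return v @ np.diag(np.log(w)) @ v.T

def expm_sym(s):
    w, v = np.linalg.eigh(_sym(s))
    return v @ np.diag(np.exp(w)) @ v.T

def log_euclidean_combine(covs, weights):
    # Weighted combination in the log domain; weights summing to 1 gives the
    # plain log-Euclidean mean.
    s = sum(w * logm_spd(c) for w, c in zip(weights, covs))
    return expm_sym(s)

# Example: combine a high-fidelity and a cheaper low-fidelity covariance estimate.
rng = np.random.default_rng(0)
a, b = rng.standard_normal((50, 3)), rng.standard_normal((500, 3))
hi, lo = np.cov(a, rowvar=False), np.cov(b, rowvar=False) + 0.05 * np.eye(3)
combined = log_euclidean_combine([hi, lo], [0.6, 0.4])
```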
arXiv Detail & Related papers (2023-01-31T16:33:46Z) - TCDM: Transformational Complexity Based Distortion Metric for Perceptual Point Cloud Quality Assessment [24.936061591860838]
The goal of objective point cloud quality assessment (PCQA) research is to develop metrics that measure point cloud quality in a consistent manner.
We evaluate the point cloud quality by measuring the complexity of transforming the distorted point cloud back to its reference.
The effectiveness of the proposed transformational complexity based distortion metric (TCDM) is evaluated through extensive experiments conducted on five public point cloud quality assessment databases.
arXiv Detail & Related papers (2022-10-10T13:20:51Z) - Posterior and Computational Uncertainty in Gaussian Processes [52.26904059556759]
Gaussian processes scale prohibitively with the size of the dataset.
Many approximation methods have been developed, which inevitably introduce approximation error.
This additional source of uncertainty, due to limited computation, is entirely ignored when using the approximate posterior.
We develop a new class of methods that provides consistent estimation of the combined uncertainty arising from both the finite number of data observed and the finite amount of computation expended.
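As an illustration only (not the paper's estimator), the sketch below compares a GP posterior variance computed with an exact solve against one computed with a small fixed budget of conjugate gradient iterations; the gap is the kind of computation-induced uncertainty the paper argues should be modeled alongside the usual data uncertainty.

```python
# Illustration: exact GP posterior variance vs. one under a limited compute budget.
import numpy as np

def rbf(a, b, ls=0.5):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def cg_solve(A, b, iters):
    # Plain conjugate gradients with a fixed iteration budget.
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-3, 3, 300))
xs = np.array([0.3])
K = rbf(x, x) + 0.01 * np.eye(len(x))
ks = rbf(x, xs)[:, 0]

exact_var = rbf(xs, xs)[0, 0] - ks @ np.linalg.solve(K, ks)
budget_var = rbf(xs, xs)[0, 0] - ks @ cg_solve(K, ks, iters=5)
print(exact_var, budget_var)   # the gap reflects the limited compute budget
```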
arXiv Detail & Related papers (2022-05-30T22:16:25Z) - PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present the Enhanced Probabilistic Dense Correspondence Network, PDC-Net+, capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z) - Stein ICP for Uncertainty Estimation in Point Cloud Matching [41.22194677919566]
Quantification of uncertainty in point cloud matching is critical in many tasks such as pose estimation, sensor fusion, and grasping.
Iterative closest point (ICP) is a commonly used pose estimation algorithm which provides a point estimate of the transformation between two point clouds.
We propose a new algorithm to align two point clouds that can precisely estimate the uncertainty of ICP's transformation parameters.
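For context, here is a minimal point-to-point ICP sketch (not Stein ICP): it returns only a point estimate of the rigid transform, which is precisely what the paper augments with uncertainty. The KD-tree correspondence search and closed-form Kabsch fit are standard choices, not taken from the paper.

```python
# Minimal point-to-point ICP: a single point estimate, no covariance.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    # Closed-form (Kabsch) fit of R, t minimizing ||R src + t - dst||^2.
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp(source, target, iters=30):
    tree = cKDTree(target)
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = source @ R.T + t
        _, idx = tree.query(moved)   # nearest-neighbor correspondences
        R, t = best_rigid_transform(source, target[idx])
    return R, t
```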
arXiv Detail & Related papers (2021-06-07T01:07:34Z) - Graph Embedding with Data Uncertainty [113.39838145450007]
Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines.
Most subspace learning methods do not take into consideration possible measurement inaccuracies or artifacts that can lead to data with high uncertainty.
arXiv Detail & Related papers (2020-09-01T15:08:23Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out of distribution data points at test time with a single forward pass.
We scale training in these models with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
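A hedged sketch of the core idea of such a single-forward-pass rejection model (not the paper's exact method): class centroids in feature space, an RBF kernel score per class, and rejection when the best score falls below a threshold. The gradient penalty and exponential-moving-average centroid update used in the paper are omitted; all names and sizes are assumptions.

```python
# Illustrative centroid/RBF classifier with out-of-distribution rejection.
import torch
import torch.nn as nn

class CentroidClassifier(nn.Module):
    def __init__(self, encoder: nn.Module, feat_dim: int, n_classes: int, sigma: float = 0.3):
        super().__init__()
        self.encoder = encoder
        self.sigma = sigma
        self.centroids = nn.Parameter(torch.randn(n_classes, feat_dim))

    def forward(self, x):
        z = self.encoder(x)                                   # (B, feat_dim)
        d2 = ((z[:, None, :] - self.centroids[None]) ** 2).sum(-1)
        return torch.exp(-d2 / (2 * self.sigma ** 2))         # kernel score per class

def predict_with_rejection(model, x, threshold=0.5):
    scores = model(x)
    conf, cls = scores.max(dim=1)
    cls = torch.where(conf < threshold, torch.full_like(cls, -1), cls)  # -1 = reject
    return cls, conf

# Example usage with a toy encoder:
enc = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 32))
model = CentroidClassifier(enc, feat_dim=32, n_classes=10)
labels, confidence = predict_with_rejection(model, torch.randn(4, 1, 28, 28))
```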
arXiv Detail & Related papers (2020-03-04T12:27:36Z)