Multiview point cloud registration with anisotropic and space-varying
localization noise
- URL: http://arxiv.org/abs/2201.00708v1
- Date: Mon, 3 Jan 2022 15:21:24 GMT
- Title: Multiview point cloud registration with anisotropic and space-varying
localization noise
- Authors: Denis Fortun, Etienne Baudrier, Fabian Zwettler, Markus Sauer and
Sylvain Faisan
- Abstract summary: We address the problem of registering multiple point clouds corrupted with high anisotropic localization noise.
Existing methods are based on an implicit assumption of space-invariant isotropic noise.
We show that our noise handling strategy significantly improves robustness to high levels of anisotropic noise.
- Score: 1.5499426028105903
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we address the problem of registering multiple point clouds
corrupted with high anisotropic localization noise. Our approach follows the
widely used framework of Gaussian mixture model (GMM) reconstruction with an
expectation-maximization (EM) algorithm. Existing methods are based on an
implicit assumption of space-invariant isotropic Gaussian noise. However, this
assumption is violated in practice in applications such as single molecule
localization microscopy (SMLM). To address this issue, we propose to introduce
an explicit localization noise model that decouples shape modeling with the GMM
from noise handling. We design a stochastic EM algorithm that considers
noise-free data as a latent variable, with closed-form solutions at each EM
step. The first advantage of our approach is to handle space-variant and
anisotropic Gaussian noise with arbitrary covariances. The second advantage is
to leverage the explicit noise model to impose prior knowledge about the noise
that may be available from physical sensors. We show on various simulated data
that our noise handling strategy significantly improves robustness to high
levels of anisotropic noise. We also demonstrate the performance of our method
on real SMLM data.
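For concreteness, the following is a minimal sketch of one EM iteration for a GMM observed through known, space-varying anisotropic Gaussian noise. It is an illustration under stated assumptions, not the authors' implementation: the registration transforms between views are omitted, and a plain EM with closed-form posterior moments stands in for the paper's stochastic EM; all names are invented here.

```python
import numpy as np

def em_step(Y, Sigma, pi, mu, Lam):
    """One EM iteration for a GMM observed through known, space-varying
    anisotropic Gaussian localization noise: y_i = x_i + n_i with
    n_i ~ N(0, Sigma_i), and x_i drawn from the GMM (pi, mu, Lam).

    Y: (n, d) observed points; Sigma: (n, d, d) per-point noise covariances;
    pi: (K,) weights; mu: (K, d) means; Lam: (K, d, d) component covariances.
    """
    n, d = Y.shape
    K = len(pi)
    logr = np.empty((n, K))       # log responsibilities
    m = np.empty((n, K, d))       # posterior means of the latent noise-free x_i
    P = np.empty((n, K, d, d))    # posterior covariances of the latent x_i

    for k in range(K):
        Lk_inv = np.linalg.inv(Lam[k])
        for i in range(n):
            # E-step (a): responsibility of component k for point i under the
            # marginal y_i | k ~ N(mu_k, Lam_k + Sigma_i) -- the noise
            # covariance is simply added to the component covariance.
            C = Lam[k] + Sigma[i]
            diff = Y[i] - mu[k]
            _, logdet = np.linalg.slogdet(C)
            logr[i, k] = np.log(pi[k]) - 0.5 * (logdet + diff @ np.linalg.solve(C, diff))
            # E-step (b): Gaussian posterior of the noise-free point x_i given
            # component k (product of two Gaussians, closed form).
            Si_inv = np.linalg.inv(Sigma[i])
            P[i, k] = np.linalg.inv(Lk_inv + Si_inv)
            m[i, k] = P[i, k] @ (Lk_inv @ mu[k] + Si_inv @ Y[i])

    logr -= logr.max(axis=1, keepdims=True)
    R = np.exp(logr)
    R /= R.sum(axis=1, keepdims=True)

    # M-step: closed-form updates using the posterior moments of the x_i.
    Nk = R.sum(axis=0)
    pi_new = Nk / n
    mu_new = np.einsum('ik,ikd->kd', R, m) / Nk[:, None]
    Lam_new = np.empty_like(Lam)
    for k in range(K):
        dev = m[:, k] - mu_new[k]
        Lam_new[k] = (np.einsum('i,ijk->jk', R[:, k], P[:, k])
                      + (R[:, k, None] * dev).T @ dev) / Nk[k]
    return pi_new, mu_new, Lam_new
```

A stochastic EM in the spirit of the abstract would instead draw each latent noise-free point from its posterior N(m_ik, P_ik) and apply standard GMM updates to the samples.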
Related papers
- Diffusion Gaussian Mixture Audio Denoise [23.760755498636943]
We propose a DiffGMM model, a denoising model based on the diffusion and Gaussian mixture models.
Given a noisy audio signal, we first apply a 1D-U-Net to extract features and train linear layers to estimate parameters for the Gaussian mixture model.
The estimated noise is then progressively subtracted from the noisy signal to output a clean audio signal.
arXiv Detail & Related papers (2024-06-13T14:18:10Z)
- Bayesian Inference of General Noise Model Parameters from Surface Code's Syndrome Statistics [0.0]
We propose Bayesian inference methods for general noise models that integrate a tensor-network simulator of the surface code.
For stationary noise, whose parameters are constant over time, we propose a method based on Markov chain Monte Carlo.
For time-varying noise, which is the more realistic situation, we introduce another method based on sequential Monte Carlo (a generic sampler sketch follows this entry).
arXiv Detail & Related papers (2024-06-13T10:26:04Z)
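Outside the paper's scope but for context: a generic random-walk Metropolis-Hastings loop for inferring a scalar noise rate from observed syndrome counts. The `loglik` function, the binomial toy likelihood, and all names are assumptions; in the paper's setting the likelihood would come from the tensor-network simulator of the surface code.

```python
import numpy as np

def metropolis_hastings(loglik, p0=0.01, n_samples=5000, step=0.002, seed=0):
    """Random-walk Metropolis for a noise rate p in (0, 1), flat prior."""
    rng = np.random.default_rng(seed)
    p, lp = p0, loglik(p0)
    samples = []
    for _ in range(n_samples):
        q = p + rng.normal(0.0, step)       # symmetric proposal
        if 0.0 < q < 1.0:                   # reject moves outside the prior support
            lq = loglik(q)
            if np.log(rng.uniform()) < lq - lp:
                p, lp = q, lq               # accept with prob min(1, L(q)/L(p))
        samples.append(p)
    return np.array(samples)

# Toy stand-in likelihood: k error events observed in n rounds, Binomial(n, p).
k, n = 37, 1000
chain = metropolis_hastings(lambda p: k * np.log(p) + (n - k) * np.log1p(-p))
print(chain[1000:].mean())   # posterior mean after burn-in, close to k/n
```

A sequential Monte Carlo variant for time-varying noise would instead propagate a weighted particle population through time, resampling as the parameters drift.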
- Information limits and Thouless-Anderson-Palmer equations for spiked matrix models with structured noise [19.496063739638924]
We consider the problem of Bayesian inference for a spiked matrix model with structured noise.
We show how to predict the statistical limits using an efficient algorithm inspired by the theory of adaptive Thouless-Anderson-Palmer equations.
arXiv Detail & Related papers (2024-05-31T16:38:35Z)
- One Noise to Rule Them All: Learning a Unified Model of Spatially-Varying Noise Patterns [33.293193191683145]
We present a single generative model which can learn to generate multiple types of noise as well as blend between them.
We also present an application of our model to improving inverse procedural material design.
arXiv Detail & Related papers (2024-04-25T02:23:11Z)
- Blue noise for diffusion models [50.99852321110366]
We introduce a novel and general class of diffusion models taking correlated noise within and across images into account.
Our framework allows introducing correlation across images within a single mini-batch to improve gradient flow (a minimal correlated-noise sketch follows this entry).
We perform both qualitative and quantitative evaluations on a variety of datasets using our method.
arXiv Detail & Related papers (2024-02-07T14:59:25Z)
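To make the cross-image correlation concrete, here is a minimal sketch under stated assumptions: it samples Gaussian noise with a prescribed equicorrelation rho between the images of a mini-batch via a Cholesky factor. It shows the mechanics only and is not the paper's construction, which designs correlated (e.g. blue) noise schedules for diffusion.

```python
import numpy as np

def correlated_batch_noise(batch_shape, rho, rng=np.random.default_rng(0)):
    """Gaussian noise correlated *across* the B images of a mini-batch
    (equicorrelation rho) while remaining white within each image.

    batch_shape: (B, C, H, W). The across-image covariance is
    (1 - rho) * I + rho * 11^T, factored by a Cholesky decomposition.
    """
    B = batch_shape[0]
    cov = (1.0 - rho) * np.eye(B) + rho * np.ones((B, B))
    L = np.linalg.cholesky(cov)                  # cov = L @ L.T
    white = rng.standard_normal(batch_shape)     # independent per image
    # Mix images along the batch axis: noise_b = sum_j L[b, j] * white_j.
    return np.einsum('bj,j...->b...', L, white)

noise = correlated_batch_noise((4, 3, 8, 8), rho=0.5)
# Empirical correlation between two images' noise fields is ~0.5:
print(np.corrcoef(noise[0].ravel(), noise[1].ravel())[0, 1])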
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Adaptive Multi-View ICA: Estimation of noise levels for optimal inference [65.94843987207445]
Adaptive Multi-View ICA (AVICA) is a noisy ICA model in which each view is a linear mixture of shared independent sources with additive noise on the sources.
On synthetic data, AVICA yields better source estimates than other group ICA methods thanks to its explicit MMSE estimator.
On real magnetoencephalography (MEG) data, we provide evidence that the decomposition is less sensitive to sampling noise and that the noise variance estimates are biologically plausible.
arXiv Detail & Related papers (2021-02-22T13:10:12Z)
- Learning based signal detection for MIMO systems with unknown noise statistics [84.02122699723536]
This paper aims to devise a generalized maximum likelihood (ML) estimator to robustly detect signals with unknown noise statistics.
In practice, there is little or even no statistical knowledge of the system noise, which in many cases is non-Gaussian, impulsive, and not analyzable.
Our framework is driven by an unsupervised learning approach, where only the noise samples are required.
arXiv Detail & Related papers (2021-01-21T04:48:15Z)
- Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z)
- Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training overparameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not (a toy contrast follows this entry).
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
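As a toy contrast of the two noise types (an illustration of the mechanics, not the paper's analysis or experiments): in the linear-regression sketch below, perturbing the labels induces gradient noise with anisotropic, data-shaped covariance sigma^2 X^T X / n^2, whereas noise added directly to the gradient is spherical; for nonlinear models the label-induced covariance would also depend on the parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((256, 10))           # fixed design matrix
w_true = rng.standard_normal(10)
y = X @ w_true                               # noiseless labels

def grad(w, y_used):
    """Full-batch least-squares gradient."""
    return X.T @ (X @ w - y_used) / len(y_used)

w_label, w_gauss = np.zeros(10), np.zeros(10)
lr, sigma = 0.05, 0.1
for _ in range(2000):
    # (a) Label perturbation: randomness enters through the data, so the
    # induced gradient noise -X^T eps / n has covariance sigma^2 X^T X / n^2
    # -- anisotropic and shaped by the data.
    eps = sigma * rng.standard_normal(len(y))
    w_label -= lr * grad(w_label, y + eps)
    # (b) Spherical Gaussian noise added directly to the gradient:
    # isotropic and independent of the data.
    w_gauss -= lr * (grad(w_gauss, y) + sigma * rng.standard_normal(10))

print(np.linalg.norm(w_label - w_true), np.linalg.norm(w_gauss - w_true))
```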
This list is automatically generated from the titles and abstracts of the papers on this site.