Multi-Irreducible Spectral Synchronization for Robust Rotation Averaging
- URL: http://arxiv.org/abs/2311.16544v1
- Date: Tue, 28 Nov 2023 06:25:26 GMT
- Title: Multi-Irreducible Spectral Synchronization for Robust Rotation Averaging
- Authors: Owen Howell, Haoen Huang, and David Rosen
- Abstract summary: We show how to estimate a set of unknown orientations $R_1, \ldots, R_N \in SO(3)$, given noisy measurements.
Results indicate how to devise measurement networks that are \emph{guaranteed} to achieve accurate estimation.
- Score: 1.2289361708127877
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Rotation averaging (RA) is a fundamental problem in robotics and computer
vision. In RA, the goal is to estimate a set of $N$ unknown orientations
$R_{1}, ..., R_{N} \in SO(3)$, given noisy measurements $R_{ij} \sim R^{-1}_{i}
R_{j}$ of a subset of their pairwise relative rotations. This problem is both
nonconvex and NP-hard, and thus difficult to solve in the general case. We
apply harmonic analysis on compact groups to derive a (convex) spectral
relaxation constructed from truncated Fourier decompositions of the individual
summands appearing in the RA objective; we then recover an estimate of the RA
solution by computing a few extremal eigenpairs of this relaxation, and
(approximately) solving a consensus problem. Our approach affords several
notable advantages versus prior RA methods: it can be used in conjunction with
\emph{any} smooth loss function (including, but not limited to, robust
M-estimators), does not require any initialization, and is implemented using
only simple (and highly scalable) linear-algebraic computations and
parallelizable optimizations over band-limited functions of individual
rotational states. Moreover, under the (physically well-motivated) assumption
of multiplicative Langevin measurement noise, we derive explicit performance
guarantees for our spectral estimator (in the form of probabilistic tail bounds
on the estimation error) that are parameterized in terms of graph-theoretic
quantities of the underlying measurement network. By concretely linking
estimator performance with properties of the underlying measurement graph, our
results also indicate how to devise measurement networks that are
\emph{guaranteed} to achieve accurate estimation, enabling such downstream
tasks as sensor placement, network compression, and active sensing.
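The core pipeline described above (stack pairwise measurements into a matrix, compute a few extremal eigenpairs, round back to rotations) can be illustrated with a minimal, standard SO(3) spectral synchronization sketch. This is a simplified textbook variant, not the paper's multi-irreducible Fourier construction, and the function names are hypothetical:

```python
import numpy as np

def project_to_SO3(M):
    """Round a 3x3 matrix to the nearest rotation (Frobenius norm) via SVD."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # enforce det = +1
    return U @ D @ Vt

def spectral_rotation_sync(R_meas, N):
    """Estimate N rotations (up to a global gauge) from relative measurements.

    R_meas: dict {(i, j): 3x3 matrix ~ R_i^{-1} R_j} for i < j.
    Returns a list of N estimated rotation matrices.
    """
    # Symmetric block measurement matrix: identity diagonal blocks,
    # measured relative rotations off the diagonal.
    A = np.eye(3 * N)
    for (i, j), Rij in R_meas.items():
        A[3*i:3*i+3, 3*j:3*j+3] = Rij
        A[3*j:3*j+3, 3*i:3*i+3] = Rij.T
    # Top-3 eigenvectors span (approximately) the stacked true rotations.
    _, V = np.linalg.eigh(A)           # eigenvalues in ascending order
    X = V[:, -3:]
    # Resolve the reflection ambiguity of the shared gauge.
    if np.linalg.det(X[0:3, :]) < 0:
        X[:, 2] *= -1
    # Round each 3x3 block back onto SO(3).
    return [project_to_SO3(X[3*i:3*i+3, :]).T for i in range(N)]
```

With noiseless measurements on a complete graph, the recovered rotations reproduce every relative rotation exactly (up to numerical precision), despite the global gauge ambiguity.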
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we present a robust phase retrieval problem where the task is to recover an unknown signal.
Our proposed method avoids the need for computationally expensive spectral initialization, using a simple gradient step that remains robust in the presence of outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z) - A Mean-Field Analysis of Neural Stochastic Gradient Descent-Ascent for Functional Minimax Optimization [90.87444114491116]
This paper studies minimax optimization problems defined over infinite-dimensional function classes of overparametrized two-layer neural networks.
We address (i) the convergence of the gradient descent-ascent algorithm and (ii) the representation learning of the neural networks.
Results show that the feature representation induced by the neural networks is allowed to deviate from the initial one by a magnitude of $O(\alpha^{-1})$, measured in terms of the Wasserstein distance.
arXiv Detail & Related papers (2024-04-18T16:46:08Z) - Stochastic Optimization for Non-convex Problem with Inexact Hessian Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) and adaptive regularization using cubics (ARC) have proven to have some very appealing theoretical properties.
We show that TR and ARC methods can simultaneously provide inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - A coherence parameter characterizing generative compressed sensing with Fourier measurements [8.106641866299379]
We prove the first known restricted isometry guarantee for generative compressed sensing with subsampled isometries.
Recovery efficacy is characterized by the coherence, a new parameter, which measures the interplay between the range of the network and the measurement matrix.
arXiv Detail & Related papers (2022-07-19T15:49:41Z) - Non-convex Distributionally Robust Optimization: Non-asymptotic Analysis [16.499651513178012]
Distributionally robust optimization (DRO) is a widely-used approach to learn models that are robust against distribution shift.
We provide non-asymptotic convergence guarantees for normalized gradient descent even though the objective function is possibly nonsmooth.
arXiv Detail & Related papers (2021-10-24T14:56:38Z) - A Retrospective Approximation Approach for Smooth Stochastic Optimization [0.2867517731896504]
Stochastic Gradient (SG) is the de facto iterative technique for solving stochastic optimization (SO) problems with a smooth objective.
arXiv Detail & Related papers (2021-03-07T16:29:36Z) - Model-based multi-parameter mapping [0.0]
Quantitative MR imaging is increasingly favoured for its richer information content and standardised measures.
Estimation methods often use subsets of the data to solve for different quantities in isolation.
Instead, a generative model can be formulated and inverted to jointly recover parameter estimates.
arXiv Detail & Related papers (2021-02-02T17:00:11Z) - Stochastic Approximation for Online Tensorial Independent Component Analysis [98.34292831923335]
Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing.
In this paper, we present an online tensorial algorithm that produces an estimate of each independent component.
arXiv Detail & Related papers (2020-12-28T18:52:37Z) - Robust subgaussian estimation with VC-dimension [0.0]
This work proposes a new general way to bound the excess risk for MOM estimators.
The core technique is the use of VC-dimension (instead of Rademacher complexity) to measure the statistical complexity.
arXiv Detail & Related papers (2020-04-24T13:21:09Z) - Semiparametric Nonlinear Bipartite Graph Representation Learning with Provable Guarantees [106.91654068632882]
We consider the bipartite graph setting and formalize its representation learning problem as statistical estimation of parameters in a semiparametric exponential family distribution.
We show that the proposed objective is strongly convex in a neighborhood around the ground truth, so that a gradient descent-based method achieves linear convergence rate.
Our estimator is robust to any model misspecification within the exponential family, which is validated in extensive experiments.
arXiv Detail & Related papers (2020-03-02T16:40:36Z)
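The last entry's claim, that gradient descent converges at a linear rate when the objective is strongly convex near the optimum, can be illustrated on a toy quadratic (not the paper's semiparametric model; the constants below are illustrative assumptions): with step size $1/L$ on a $\mu$-strongly convex, $L$-smooth objective, the error contracts by a factor of at most $1 - \mu/L$ per iteration.

```python
import numpy as np

# Toy quadratic f(x) = 0.5 x^T Q x with eigenvalues in [mu, L] = [1, 10].
# Gradient descent with step 1/L contracts the error by <= 1 - mu/L = 0.9
# per step, i.e. linear (geometric) convergence to the minimizer x* = 0.
Q = np.diag([1.0, 4.0, 10.0])
mu, L = 1.0, 10.0
rng = np.random.default_rng(1)
x = rng.standard_normal(3)
errors = [np.linalg.norm(x)]
for _ in range(200):
    x = x - (1.0 / L) * (Q @ x)   # gradient of 0.5 x^T Q x is Q x
    errors.append(np.linalg.norm(x))
# errors[k] <= (1 - mu/L)^k * errors[0], so errors shrink geometrically
```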
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.