Rotation Coordinate Descent for Fast Globally Optimal Rotation Averaging
- URL: http://arxiv.org/abs/2103.08292v2
- Date: Tue, 16 Mar 2021 02:06:53 GMT
- Title: Rotation Coordinate Descent for Fast Globally Optimal Rotation Averaging
- Authors: Álvaro Parra, Shin-Fang Chng, Tat-Jun Chin, Anders Eriksson, Ian Reid
- Abstract summary: We present a fast algorithm, called rotation coordinate descent (RCD), that achieves global optimality.
Unlike block coordinate descent (BCD), which solves the SDP by updating the semidefinite matrix in a row-by-row fashion, RCD directly maintains and updates all valid rotations throughout the iterations.
We mathematically prove the convergence of our algorithm and empirically show its superior efficiency over state-of-the-art global methods.
- Score: 47.3713707521106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Under mild conditions on the noise level of the measurements, rotation
averaging satisfies strong duality, which enables global solutions to be
obtained via semidefinite programming (SDP) relaxation. However, generic
solvers for SDP are rather slow in practice, even on rotation averaging
instances of moderate size, thus developing specialised algorithms is vital. In
this paper, we present a fast algorithm, called rotation coordinate descent
(RCD), that achieves global optimality. Unlike block coordinate descent (BCD),
which solves the SDP by updating the semidefinite matrix in a row-by-row fashion, RCD
directly maintains and updates all valid rotations throughout the iterations.
This obviates the need to store a large dense semidefinite matrix. We
mathematically prove the convergence of our algorithm and empirically show its
superior efficiency over state-of-the-art global methods on a variety of
problem configurations. Maintaining valid rotations also facilitates
incorporating local optimisation routines for further speed-ups. Moreover, our
algorithm is simple to implement; see supplementary material for a
demonstration program.
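To make the contrast concrete, below is a minimal Python/NumPy sketch of a coordinate-descent-style update for chordal rotation averaging: each absolute rotation is refreshed by projecting the accumulated contributions of its neighbours onto SO(3) via an SVD, so only the n rotations are stored rather than a dense 3n x 3n semidefinite matrix. This is an illustrative sketch, not the paper's exact RCD step; the graph representation, function names, and fixed iteration count are assumptions.

```python
import numpy as np

def project_so3(M):
    """Return the rotation in SO(3) nearest to the 3x3 matrix M."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # keep det = +1
    return U @ D @ Vt

def coordinate_descent_rotation_averaging(rel_rots, n, iters=50):
    """Illustrative coordinate descent for chordal rotation averaging.

    rel_rots: dict mapping an edge (i, j) to a measured relative
              rotation R_ij, with R_ij ~ R_i @ R_j.T
    n:        number of absolute rotations to estimate
    """
    R = [np.eye(3) for _ in range(n)]
    for _ in range(iters):
        for k in range(1, n):  # R[0] stays fixed to pin down the gauge
            # Accumulate contributions from all edges incident to k.
            W = np.zeros((3, 3))
            for (i, j), Rij in rel_rots.items():
                if i == k:
                    W += Rij @ R[j]        # from R_k ~ R_ij @ R_j
                elif j == k:
                    W += Rij.T @ R[i]      # from R_k ~ R_ij.T @ R_i
            if np.linalg.norm(W) > 0:
                R[k] = project_so3(W)      # optimal update for this block
    return R
```

Each block update is optimal for the chordal cost with the other rotations held fixed; convergence checks and the SDP-based certificate of global optimality are omitted here.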
Related papers
- Fast, Scalable, Warm-Start Semidefinite Programming with Spectral
Bundling and Sketching [53.91395791840179]
We present Unified Spectral Bundling with Sketching (USBS), a provably correct, fast and scalable algorithm for solving massive SDPs.
USBS provides a 500x speed-up over the state-of-the-art scalable SDP solver on an instance with over 2 billion decision variables.
arXiv Detail & Related papers (2023-12-19T02:27:22Z) - Accelerating Cutting-Plane Algorithms via Reinforcement Learning
Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z) - Large-scale Optimization of Partial AUC in a Range of False Positive
Rates [51.12047280149546]
The area under the ROC curve (AUC) is one of the most widely used performance measures for classification models in machine learning.
We develop an efficient approximated gradient descent method based on a recent practical envelope-smoothing technique.
Our proposed algorithm can also be used to minimize the sum of some ranked range loss, which also lacks efficient solvers.
arXiv Detail & Related papers (2022-03-03T03:46:18Z) - Distributed stochastic optimization with large delays [59.95552973784946]
One of the most widely used methods for solving large-scale optimization problems is distributed asynchronous stochastic gradient descent (DASGD).
We show that DASGD converges to a global optimum under mild assumptions on the delays.
arXiv Detail & Related papers (2021-07-06T21:59:49Z) - Momentum-inspired Low-Rank Coordinate Descent for Diagonally Constrained
SDPs [12.7944665592057]
We present a novel, practical, and provable approach for solving diagonally constrained semidefinite programming (SDP) problems at scale using accelerated non-convex programming (a minimal unaccelerated sketch appears after this list).
arXiv Detail & Related papers (2021-06-16T13:35:40Z) - Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex
Decentralized Optimization Over Time-Varying Networks [79.16773494166644]
We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network.
We design two optimal algorithms that attain these lower bounds.
We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-08T15:54:44Z) - Efficient Algorithms for Rotation Averaging Problems [17.101725065752486]
The rotation averaging problem is a fundamental task in computer vision applications.
We propose a block coordinate descent based rotation averaging algorithm with guaranteed convergence to stationary points.
We also propose an alternative rotation averaging algorithm by applying successive upper-bound minimization.
arXiv Detail & Related papers (2021-03-18T05:22:45Z) - Hybrid Rotation Averaging: A Fast and Robust Rotation Averaging Approach [28.56388668402907]
We propose a hybrid rotation averaging approach to incremental Structure from Motion (SfM).
Global rotation averaging (RA) methods ensure global optimality in low-noise conditions, but they are inefficient and may easily deviate under the influence of outliers or elevated noise levels.
We demonstrate high practicality of the proposed method as bad camera poses are effectively corrected and drift is reduced.
arXiv Detail & Related papers (2021-01-22T14:11:19Z) - Shonan Rotation Averaging: Global Optimality by Surfing $SO(p)^n$ [26.686173666277725]
Shonan Rotation Averaging is guaranteed to recover globally optimal solutions under mild assumptions on the measurement noise.
Our method employs semidefinite relaxation in order to recover provably globally optimal solutions of the rotation averaging problem.
arXiv Detail & Related papers (2020-08-06T16:08:23Z) - Stochastic Coordinate Minimization with Progressive Precision for
Stochastic Convex Optimization [16.0251555430107]
A framework based on iterative coordinate minimization (CM) is developed for convex optimization.
We establish the optimal precision control and the resulting order-optimal regret performance.
The proposed algorithm is amenable to online implementation and inherits the scalability and parallelizability properties of CM for large-scale optimization.
arXiv Detail & Related papers (2020-03-11T18:42:40Z)
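As referenced in the Momentum-inspired Low-Rank Coordinate Descent entry above, here is a minimal unaccelerated sketch of low-rank coordinate descent for a diagonally constrained SDP. It assumes the standard Burer-Monteiro factorisation Y = V @ V.T with unit-norm rows (so diag(Y) = 1 holds by construction) and a symmetric cost matrix C; the momentum acceleration of that paper is omitted, and all names are illustrative.

```python
import numpy as np

def lowrank_cd_sdp(C, r, iters=200, seed=0):
    """Illustrative low-rank coordinate descent for
        min tr(C @ Y)  s.t.  Y PSD,  diag(Y) = 1,
    with C symmetric, via the factorisation Y = V @ V.T whose rows
    are kept unit-norm. Plain coordinate descent, no momentum.
    """
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((n, r))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(iters):
        for i in range(n):
            # Partial gradient w.r.t. row i, excluding the C_ii term
            # (constant under the unit-norm constraint).
            g = C[i] @ V - C[i, i] * V[i]
            norm = np.linalg.norm(g)
            if norm > 1e-12:
                V[i] = -g / norm  # closed-form minimiser on the sphere
    return V @ V.T  # the (approximately) optimal PSD matrix Y
```

Each row update is the exact minimiser over the unit sphere with the other rows held fixed; in practice the rank r can be kept far smaller than n while still recovering the SDP optimum.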