RAGO: Recurrent Graph Optimizer For Multiple Rotation Averaging
- URL: http://arxiv.org/abs/2212.07211v1
- Date: Wed, 14 Dec 2022 13:19:40 GMT
- Title: RAGO: Recurrent Graph Optimizer For Multiple Rotation Averaging
- Authors: Heng Li, Zhaopeng Cui, Shuaicheng Liu, Ping Tan
- Abstract summary: This paper proposes a deep recurrent Rotation Averaging Graph Optimizer (RAGO) for Multiple Rotation Averaging (MRA).
Our framework is a compact, real-time learning-to-optimize rotation averaging graph optimizer that can be deployed in real-world applications.
- Score: 62.315673415889314
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes a deep recurrent Rotation Averaging Graph Optimizer
(RAGO) for Multiple Rotation Averaging (MRA). Conventional optimization-based
methods usually fail to produce accurate results due to corrupted and noisy
relative measurements. Recent learning-based approaches regard MRA as a
regression problem, but these methods are sensitive to initialization due to
the gauge freedom problem. To handle these problems, we propose a learnable
iterative graph optimizer minimizing a gauge-invariant cost function with an
edge rectification strategy to mitigate the effect of inaccurate measurements.
Our graph optimizer iteratively refines the global camera rotations by
minimizing each node's single rotation objective function. In addition, our
approach iteratively rectifies relative rotations to make them more consistent
with the current camera orientations and observed relative rotations.
Furthermore, we employ a gated recurrent unit to improve the result by tracing
the temporal information of the cost graph. Our framework is a compact, real-time
learning-to-optimize rotation averaging graph optimizer that can be deployed in
real-world applications. RAGO outperforms previous traditional and
deep methods on real-world and synthetic datasets. The code is available at
https://github.com/sfu-gruvi-3dv/RAGO
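As described in the abstract, RAGO alternates between two steps: each global camera rotation is re-estimated from its neighbors and the (rectified) relative rotations, and each relative rotation is then rectified toward consistency with the current global estimates, with a GRU driving the learned updates. The snippet below is a minimal, non-learned sketch of that alternation, assuming integer node ids and using classical chordal averaging with a fixed blending weight `alpha` in place of RAGO's learned updates; all names and the data layout are illustrative, not taken from the released code.

```python
# Minimal sketch of alternating node refinement and edge rectification.
# This is a hand-crafted stand-in for RAGO's learned GRU-based updates.
import numpy as np

def project_to_so3(M):
    """Closest rotation matrix to M in the chordal (Frobenius) sense."""
    U, _, Vt = np.linalg.svd(M)
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

def refine_rotations(R, rel, n_iters=50, alpha=0.1):
    """
    R   : dict node -> 3x3 initial global rotation (node ids are integers).
    rel : dict (i, j) -> 3x3 relative rotation with R[j] ~= rel[(i, j)] @ R[i];
          both directions are stored, rel[(j, i)] == rel[(i, j)].T.
    alpha : how strongly measurements are pulled toward graph consistency.
    """
    rel = dict(rel)  # work on a rectified copy; raw measurements stay intact
    for _ in range(n_iters):
        # Node step: re-estimate each camera from its neighbors' predictions.
        for i in R:
            preds = [rel[(j, k)] @ R[j] for (j, k) in rel if k == i]
            if preds:
                R[i] = project_to_so3(np.mean(preds, axis=0))
        # Edge step: nudge each relative rotation toward the rotation implied
        # by the current global estimates (a crude form of rectification).
        for (i, j) in list(rel):
            if i < j:  # update each undirected edge once, keep directions consistent
                implied = R[j] @ R[i].T
                new_rel = project_to_so3((1.0 - alpha) * rel[(i, j)] + alpha * implied)
                rel[(i, j)], rel[(j, i)] = new_rel, new_rel.T
    return R, rel
```

Because of gauge freedom, the recovered global rotations are only determined up to one common rotation applied to all cameras, which is why the abstract stresses a gauge-invariant cost that scores only the consistency between relative and global rotations.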
Related papers
- Gravity-aligned Rotation Averaging with Circular Regression [53.81374943525774]
We introduce a principled approach that integrates gravity direction into the rotation averaging phase of global pipelines.
We achieve state-of-the-art accuracy on four large-scale datasets.
arXiv Detail & Related papers (2024-10-16T17:37:43Z)
- Vanishing Point Estimation in Uncalibrated Images with Prior Gravity Direction [82.72686460985297]
We tackle the problem of estimating a Manhattan frame.
We derive two new 2-line solvers, one of which does not suffer from singularities affecting existing solvers.
We also design a new non-minimal method, running on an arbitrary number of lines, to boost the performance in local optimization.
arXiv Detail & Related papers (2023-08-21T13:03:25Z)
- Revisiting Rotation Averaging: Uncertainties and Robust Losses [51.64986160468128]
We argue that the main problem of current methods is that the minimized cost function is only weakly connected with the input data via the estimated epipolar geometries.
We propose to better model the underlying noise distributions by directly propagating the uncertainty from the point correspondences into the rotation averaging (a weighted-averaging sketch is given after this list).
arXiv Detail & Related papers (2023-03-09T11:51:20Z)
- E-Graph: Minimal Solution for Rigid Rotation with Extensibility Graphs [61.552125054227595]
A new minimal solution is proposed to solve relative rotation estimation between two images without overlapping areas.
Based on E-Graph, the rotation estimation problem becomes simpler and more elegant.
We embed our rotation estimation strategy into a complete camera tracking and mapping system which obtains 6-DoF camera poses and a dense 3D mesh model.
arXiv Detail & Related papers (2022-07-20T16:11:48Z)
- Smooth over-parameterized solvers for non-smooth structured optimization [3.756550107432323]
Non-smoothness encodes structural constraints on the solutions, such as sparsity, group sparsity, low rank, and sharp edges.
We operate a non-convex but smooth over-parametrization of the underlying non-smooth optimization problems.
Our main contribution is to apply the Variable Projection (VarPro) method, which defines a new formulation by explicitly minimizing over part of the variables.
arXiv Detail & Related papers (2022-05-03T09:23:07Z)
- Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms [69.45237691598774]
We study the problem of least squares linear regression where the data points are dependent and are sampled from a Markov chain.
We establish sharp information-theoretic minimax lower bounds for this problem in terms of the mixing time $\tau_{\mathsf{mix}}$.
We propose an algorithm based on experience replay, a popular reinforcement learning technique, that achieves a significantly better error rate (a generic replay-buffer sketch follows this list).
arXiv Detail & Related papers (2020-06-16T04:26:50Z)
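The entry on "Revisiting Rotation Averaging: Uncertainties and Robust Losses" above argues for coupling the cost to the input data by propagating correspondence uncertainty into the averaging. A minimal way to illustrate the idea, assuming each relative rotation simply comes with a scalar confidence (for instance an inverse-covariance-derived weight), is a weighted chordal averaging sweep as sketched below; the weighting scheme and names are illustrative, not that paper's implementation.

```python
# Hedged sketch: per-edge confidence weights (e.g. derived from correspondence
# covariances) folded into a weighted chordal rotation-averaging sweep.
import numpy as np

def project_to_so3(M):
    """Closest rotation matrix to M in the chordal (Frobenius) sense."""
    U, _, Vt = np.linalg.svd(M)
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

def weighted_rotation_averaging(R, rel, weight, n_iters=100):
    """
    R      : dict node -> 3x3 initial global rotation.
    rel    : dict (i, j) -> 3x3 relative rotation with R[j] ~= rel[(i, j)] @ R[i]
             (both directions present).
    weight : dict (i, j) -> scalar confidence of that measurement; higher means
             the two-view estimate is trusted more.
    """
    for _ in range(n_iters):
        for i in R:
            acc = np.zeros((3, 3))
            for (j, k), R_ji in rel.items():
                if k == i:
                    acc += weight[(j, k)] * (R_ji @ R[j])  # weighted prediction of R[i]
            if np.any(acc):
                R[i] = project_to_so3(acc)
    return R
```

A robust loss can be emulated in the same loop by recomputing the weights between sweeps from the current residuals, down-weighting edges that disagree strongly with the rest of the graph.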
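The last entry above credits experience replay, a technique borrowed from reinforcement learning, for the improved error rate on Markovian data. The sketch below only illustrates the general idea for streaming linear least squares, not that paper's exact algorithm or analysis: correlated samples from the chain are buffered and replayed in random order, reducing the correlation seen by each SGD step. All names and hyperparameters are assumptions.

```python
# Generic illustration of SGD with an experience-replay buffer for streaming
# least squares on Markov-chain data (not the referenced paper's algorithm).
import numpy as np
from collections import deque

def replay_sgd(stream, dim, buffer_size=256, lr=0.01, replays_per_step=4, seed=0):
    """
    stream : iterable of (x, y) pairs where consecutive x's are drawn from a Markov chain.
    dim    : dimension of the feature vector x and weight vector w.
    Returns an estimate of w minimizing E[(y - w @ x)^2].
    """
    rng = np.random.default_rng(seed)
    buffer = deque(maxlen=buffer_size)
    w = np.zeros(dim)
    for x, y in stream:
        buffer.append((np.asarray(x, dtype=float), float(y)))
        for _ in range(replays_per_step):
            xb, yb = buffer[rng.integers(len(buffer))]  # replay a random past sample
            grad = 2.0 * (w @ xb - yb) * xb             # gradient of the squared error
            w -= lr * grad
    return w
```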