Computing stable resultant-based minimal solvers by hiding a variable
- URL: http://arxiv.org/abs/2007.10100v1
- Date: Fri, 17 Jul 2020 07:40:10 GMT
- Title: Computing stable resultant-based minimal solvers by hiding a variable
- Authors: Snehal Bhayani, Zuzana Kukelova and Janne Heikkilä
- Abstract summary: Computer vision applications require robust estimation of camera geometry from a minimal number of input data measurements.
In this paper, we study an alternative sparse resultant-based method for solving sparse systems of polynomial equations by hiding one variable.
We show that for the studied problems, our resultant-based approach leads to more stable solvers than the state-of-the-art Gröbner basis-based solvers.
- Score: 20.402488757146692
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many computer vision applications require robust and efficient estimation of
camera geometry. The robust estimation is usually based on solving camera
geometry problems from a minimal number of input data measurements, i.e.,
solving minimal problems, in a RANSAC-style framework. Minimal problems often
result in complex systems of polynomial equations. The existing
state-of-the-art methods for solving such systems are based either on Gröbner
bases and the action matrix method, which have been extensively studied and
optimized in recent years, or on a recently proposed approach based on a sparse
resultant computation using an extra variable.
In this paper, we study an interesting alternative sparse resultant-based
method for solving sparse systems of polynomial equations by hiding one
variable. This approach results in a larger eigenvalue problem than the action
matrix and extra variable sparse resultant-based methods; however, it does not
need to compute an inverse or elimination of large matrices that may be
numerically unstable. The proposed approach includes several improvements to
the standard sparse resultant algorithms, which significantly improves the
efficiency and stability of the hidden variable resultant-based solvers as we
demonstrate on several interesting computer vision problems. We show that for
the studied problems, our sparse resultant-based approach leads to more stable
solvers than the state-of-the-art Gröbner basis-based solvers as well as
existing sparse resultant-based solvers, especially in close to critical
configurations. Our new method can be fully automated and incorporated into
existing tools for the automatic generation of efficient minimal solvers.
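The hidden variable idea can be sketched on a toy system: hiding one variable turns the resultant matrix into a matrix polynomial, and the roots become a polynomial eigenvalue problem solved via linearization. The sketch below is a minimal illustration under simplifying assumptions (a hand-built 3x3 Sylvester matrix for a two-equation system), not the paper's automated solver generator; note that for this tiny, well-conditioned example we invert the linearized pencil for simplicity, whereas avoiding such inversions on large matrices is precisely the paper's concern.

```python
import numpy as np

# Toy system: f1 = x^2 + y^2 - 1 = 0, f2 = x - y = 0. Hiding y, the
# Sylvester resultant matrix w.r.t. x becomes a matrix polynomial
# M(y) = M0 + y*M1 + y^2*M2 acting on the monomial vector [x^2, x, 1].
M0 = np.array([[1., 0., -1.],
               [1., 0.,  0.],
               [0., 1.,  0.]])
M1 = np.array([[0.,  0.,  0.],
               [0., -1.,  0.],
               [0.,  0., -1.]])
M2 = np.array([[0., 0., 1.],
               [0., 0., 0.],
               [0., 0., 0.]])

n = 3
I, Z = np.eye(n), np.zeros((n, n))
# Companion linearization of the polynomial eigenvalue problem M(y)v = 0:
# A z = y B z with z = [v; y*v].
A = np.block([[Z, I], [-M0, -M1]])
B = np.block([[I, Z], [Z, M2]])

# For this tiny pencil we solve the ordinary eigenproblem of A^{-1}B,
# whose eigenvalues are mu = 1/y; mu ~ 0 corresponds to y at infinity,
# caused by the singular leading matrix M2.
mu, V = np.linalg.eig(np.linalg.solve(A, B))
keep = np.abs(mu) > 1e-3
ys = np.real(1.0 / mu[keep])
# x is read off the null vector v ~ [x^2, x, 1] (top half of z).
xs = np.real(V[:n, keep][1] / V[:n, keep][2])
# The two real solutions are x = y = +/- 1/sqrt(2).
print(sorted(zip(np.round(xs, 6), np.round(ys, 6))))
```

The extracted finite eigenvalues are the values of the hidden variable y, and the corresponding eigenvectors deliver the remaining unknown x without any further root finding.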
Related papers
- Optimizing Solution-Samplers for Combinatorial Problems: The Landscape
of Policy-Gradient Methods [52.0617030129699]
We introduce a novel theoretical framework for analyzing the effectiveness of DeepMatching Networks and Reinforcement Learning methods.
Our main contribution holds for a broad class of problems including Max- and Min-Cut, Max-$k$-Bipartite-Bi, Maximum-Weight-Bipartite-Bi, and the Traveling Salesman Problem.
As a byproduct of our analysis we introduce a novel regularization process over vanilla descent and provide theoretical and experimental evidence that it helps address vanishing-gradient issues and escape bad stationary points.
arXiv Detail & Related papers (2023-10-08T23:39:38Z) - Sparse resultant based minimal solvers in computer vision and their
connection with the action matrix [17.31412310131552]
We show that for some camera geometry problems our extra variable-based sparse resultant method leads to smaller and more stable solvers than the state-of-the-art Gröbner basis-based solvers.
It provides a competitive alternative to popular Gröbner basis-based methods for minimal problems in computer vision.
arXiv Detail & Related papers (2023-01-16T14:25:19Z) - Sparse Polynomial Optimization: Theory and Practice [5.27013884159732]
The book presents several efforts to tackle this challenge, with important scientific implications.
It provides alternative optimization schemes that scale well in terms of computational complexity.
We present sparsity-exploiting hierarchies of relaxations, for either unconstrained or constrained problems.
arXiv Detail & Related papers (2022-08-23T18:56:05Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z) - Weighted Low Rank Matrix Approximation and Acceleration [0.5177947445379687]
Low-rank matrix approximation is one of the central concepts in machine learning.
Low-rank matrix completion (LRMC) solves the LRMA problem when some observations are missing.
We propose an algorithm for solving the weighted problem, as well as two acceleration techniques.
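To make the weighted low-rank approximation setting concrete, here is a minimal sketch assuming binary observation weights (1 = observed, 0 = missing); this EM-style "impute and re-fit" loop is a standard baseline for the problem, not necessarily the paper's algorithm or its acceleration techniques:

```python
import numpy as np

def weighted_lra(X, W, r, iters=200):
    """Rank-r approximation of X under binary weights W (1 = observed)."""
    Y = np.where(W, X, 0.0)              # start with observed entries only
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        L = (U[:, :r] * s[:r]) @ Vt[:r]  # rank-r fit of the filled matrix
        Y = np.where(W, X, L)            # keep data where observed, impute the rest
    return L

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))  # true rank 3
W = rng.random((20, 15)) < 0.7                                   # ~70% observed
L = weighted_lra(X, W, r=3)
print(np.mean(np.abs(L - X)))  # mean recovery error over all entries
```

With enough observed entries relative to the rank, the imputed low-rank estimate recovers the unobserved entries as well, which is the LRMC special case mentioned above.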
arXiv Detail & Related papers (2021-09-22T22:03:48Z) - Efficient Large Scale Inlier Voting for Geometric Vision Problems [3.1231364554255636]
Outlier rejection and equivalently inlier set optimization is a key ingredient in numerous applications in computer vision.
We present an efficient and general algorithm for outlier rejection based on "intersecting" $k$-dimensional surfaces in $R^d$.
We demonstrate the versatility of the approach on several camera posing problems with a high number of matches at low inlier ratio.
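The flavor of inlier voting can be illustrated with the simplest instance of the idea: 1-dimensional surfaces (lines) in $R^2$ that should intersect in a common point. This accumulator sketch is a generic voting baseline, not the paper's algorithm; all names and parameters are illustrative.

```python
import numpy as np

# Inlier lines a*x + b*y = c all pass through a hidden point p; outlier
# lines are random. Each line votes for the grid cells it crosses, and
# the cell with the most votes estimates p despite the low inlier ratio.
rng = np.random.default_rng(1)
p = np.array([0.3, -0.2])                 # hidden point (ground truth)
lines = []
for _ in range(60):                       # inlier lines through p
    a, b = rng.standard_normal(2)
    lines.append((a, b, a * p[0] + b * p[1]))
for _ in range(40):                       # outlier lines
    lines.append(tuple(rng.standard_normal(3)))

xs_grid = np.linspace(-1, 1, 201)         # 0.01-wide accumulator cells
acc = np.zeros((201, 201))
for a, b, c in lines:
    for i, x in enumerate(xs_grid):       # vote once per x-column
        if abs(b) > 1e-9:
            y = (c - a * x) / b
            j = int(round((y + 1) / 2 * 200))
            if 0 <= j <= 200:
                acc[i, j] += 1
i, j = np.unravel_index(np.argmax(acc), acc.shape)
print(xs_grid[i], xs_grid[j])             # peak cell, approx (0.3, -0.2)
```

The inliers concentrate all their votes in the one cell containing p, while outlier votes spread thinly, so the peak survives even at a 60% inlier ratio; efficient variants of this idea avoid the dense grid.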
arXiv Detail & Related papers (2021-07-25T14:13:07Z) - Solving weakly supervised regression problem using low-rank manifold
regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z) - Computational Barriers to Estimation from Low-Degree Polynomials [81.67886161671379]
We study the power of low-degree polynomials for the task of detecting the presence of hidden structures.
For a large class of "signal plus noise" problems, we give a user-friendly lower bound for the best possible mean squared error achievable by any low-degree estimator.
As applications, we give a tight characterization of the low-degree minimum mean squared error for the planted submatrix and planted dense subgraph problems.
arXiv Detail & Related papers (2020-08-05T17:52:10Z) - Hybrid Variance-Reduced SGD Algorithms For Nonconvex-Concave Minimax
Problems [26.24895953952318]
We develop an algorithm to solve a class of nonconvex-concave minimax problems.
It can also work with either single-sample or mini-batch derivative estimates.
arXiv Detail & Related papers (2020-06-27T03:05:18Z) - An Online Method for A Class of Distributionally Robust Optimization
with Non-Convex Objectives [54.29001037565384]
We propose a practical online method for solving a class of online distributionally robust optimization (DRO) problems.
Our studies demonstrate important applications in machine learning for improving the robustness of networks.
arXiv Detail & Related papers (2020-06-17T20:19:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.