Sparse Quadratic Optimisation over the Stiefel Manifold with Application
to Permutation Synchronisation
- URL: http://arxiv.org/abs/2110.00053v1
- Date: Thu, 30 Sep 2021 19:17:35 GMT
- Title: Sparse Quadratic Optimisation over the Stiefel Manifold with Application
to Permutation Synchronisation
- Authors: Florian Bernard, Daniel Cremers, Johan Thunberg
- Abstract summary: We address the non-convex optimisation problem of finding a sparse matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting modification of the Orthogonal Iteration algorithm for finding the dominant eigenspace of a matrix.
- Score: 71.27989298860481
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address the non-convex optimisation problem of finding a sparse matrix on
the Stiefel manifold (matrices with mutually orthogonal columns of unit length)
that maximises (or minimises) a quadratic objective function. Optimisation
problems on the Stiefel manifold occur for example in spectral relaxations of
various combinatorial problems, such as graph matching, clustering, or
permutation synchronisation. Although sparsity is a desirable property in such
settings, it is mostly neglected in spectral formulations since existing
solvers, e.g. based on eigenvalue decomposition, are unable to account for
sparsity while at the same time maintaining global optimality guarantees. We
fill this gap and propose a simple yet effective sparsity-promoting
modification of the Orthogonal Iteration algorithm for finding the dominant
eigenspace of a matrix. By doing so, we can guarantee that our method finds a
Stiefel matrix that is globally optimal with respect to the quadratic objective
function, while in addition being sparse. As a motivating application we
consider the task of permutation synchronisation, which can be understood as a
constrained clustering problem that has particular relevance for matching
multiple images or 3D shapes in computer vision, computer graphics, and beyond.
We demonstrate that the proposed approach outperforms previous methods in this
domain.
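To make the setting concrete, below is a minimal sketch of the classical Orthogonal Iteration that the paper builds on: it computes an orthonormal basis U of the dominant d-dimensional eigenspace of a symmetric matrix W, i.e. a maximiser of trace(U^T W U) over the Stiefel manifold. In permutation synchronisation, W would typically be the symmetric block matrix of pairwise permutation estimates. The function name and toy data are illustrative assumptions; the paper's sparsity-promoting modification is only indicated by a comment and is not reproduced here.

```python
import numpy as np

def orthogonal_iteration(W, d, num_iters=100, seed=0):
    """Plain Orthogonal Iteration: returns an n x d matrix U with orthonormal
    columns spanning (approximately) the dominant d-dimensional eigenspace of
    the symmetric matrix W, i.e. a maximiser of trace(U^T W U) over the
    Stiefel manifold."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    U, _ = np.linalg.qr(rng.standard_normal((n, d)))  # random orthonormal start
    for _ in range(num_iters):
        Z = W @ U                  # power step towards the dominant eigenspace
        # The paper's method inserts its sparsity-promoting step at this point;
        # that modification is not reproduced in this sketch.
        U, _ = np.linalg.qr(Z)     # re-orthonormalise the columns
    return U

# Toy usage: a symmetric matrix with a clearly dominant 2-dimensional eigenspace.
W = np.diag([10.0, 9.0, 1.0, 0.5, 0.1])
U = orthogonal_iteration(W, d=2)
print(np.round(U, 3))  # columns approximately span the first two coordinate axes
```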
Related papers
- Towards Efficient Pareto-optimal Utility-Fairness between Groups in
Repeated Rankings [7.6275971668447005]
We tackle the problem of computing a sequence of rankings with a guaranteed Pareto-optimal balance between consumers and producers of items.
We introduce a novel approach to the above problem by using the Expohedron - a permutahedron whose points represent all exposures of items.
We further propose an efficient method by relaxing our optimization problem on the Expohedron's circumscribed $n$-sphere, which significantly improves the running time.
arXiv Detail & Related papers (2024-02-22T05:48:54Z) - Diff-PCR: Diffusion-Based Correspondence Searching in Doubly Stochastic
Matrix Space for Point Cloud Registration [35.82753072083472]
State-of-the-art methods have employed RAFT-like iterative updates to refine the solution.
We propose a novel approach that exploits the Denoising Diffusion Model to predict a searching gradient for the optimal matching matrix.
Our method offers flexibility by allowing the search to start from any initial matching matrix provided by the online backbone or white noise.
arXiv Detail & Related papers (2023-12-31T09:24:28Z) - Kernelized multi-graph matching [0.0]
We introduce a novel kernelized multigraph matching technique that handles attributes of both the vertices and the edges of the graphs.
We propose several projectors leading to improved stability of the results.
arXiv Detail & Related papers (2022-10-11T07:22:47Z) - Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z) - Nonlinear matrix recovery using optimization on the Grassmann manifold [18.655422834567577]
We investigate the problem of recovering a partially observed high-rank clustering matrix whose columns obey a nonlinear structure such as a union of subspaces.
We show that the alternating iterates converge to a unique limit point using the Kurdyka-Łojasiewicz property.
arXiv Detail & Related papers (2021-09-13T16:13:13Z) - Implicit differentiation for fast hyperparameter selection in non-smooth
convex learning [87.60600646105696]
We study first-order methods when the inner optimization problem is convex but non-smooth.
We show that the forward-mode differentiation of proximal gradient descent and proximal coordinate descent yields sequences of Jacobians converging toward the exact Jacobian (a minimal forward-mode sketch for proximal gradient descent appears after this list).
arXiv Detail & Related papers (2021-05-04T17:31:28Z) - Solving weakly supervised regression problem using low-rank manifold
regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z) - Hybrid Trilinear and Bilinear Programming for Aligning Partially
Overlapping Point Sets [85.71360365315128]
In many applications, we need algorithms that can align partially overlapping point sets and are invariant to the corresponding transformations, unlike the standard robust point matching (RPM) algorithm.
We first show that the objective is a cubic polynomial function. We then utilize the convex envelopes of trilinear and bilinear monomial transformations to derive its lower bound.
We next develop a branch-and-bound (BnB) algorithm which only branches over the transformation variables and runs efficiently.
arXiv Detail & Related papers (2021-01-19T04:24:23Z) - Mixed-Projection Conic Optimization: A New Paradigm for Modeling Rank
Constraints [3.179831861897336]
We provide a framework for solving low-rank optimization problems to certifiable optimality.
Our framework also provides near-optimal solutions through rounding and local search techniques.
arXiv Detail & Related papers (2020-09-22T08:59:06Z) - Multi-Objective Matrix Normalization for Fine-grained Visual Recognition [153.49014114484424]
Bilinear pooling achieves great success in fine-grained visual recognition (FGVC).
Recent methods have shown that the matrix power normalization can stabilize the second-order information in bilinear features.
We propose an efficient Multi-Objective Matrix Normalization (MOMN) method that can simultaneously normalize a bilinear representation with respect to multiple objectives.
arXiv Detail & Related papers (2020-03-30T08:40:35Z)
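For the implicit-differentiation entry above, the following is a minimal sketch, assuming a Lasso inner problem, of forward-mode differentiation of proximal gradient descent (ISTA): the iterate and its Jacobian with respect to the regularisation parameter are propagated jointly through each gradient and soft-thresholding step. Function names and toy data are hypothetical; this illustrates the general technique, not the cited paper's implementation.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_forward_mode(A, b, lam, step, num_iters=500):
    """Proximal gradient descent (ISTA) for the Lasso
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    propagating J = dx/dlam by forward-mode differentiation of every iterate,
    so the hyperparameter Jacobian is available alongside the solution."""
    n = A.shape[1]
    x = np.zeros(n)
    J = np.zeros(n)                          # Jacobian dx/dlam (a vector here)
    for _ in range(num_iters):
        z = x - step * (A.T @ (A @ x - b))   # gradient step
        dz = J - step * (A.T @ (A @ J))      # its derivative w.r.t. lam
        active = (np.abs(z) > step * lam).astype(float)
        x = soft_threshold(z, step * lam)
        # soft_threshold is piecewise linear, so its forward-mode derivative is
        # the active-set indicator times dz, minus step * sign(z) on that set.
        J = active * dz - step * np.sign(z) * active
    return x, J

# Toy usage on hypothetical random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x, dx_dlam = ista_forward_mode(A, b, lam=0.5, step=1.0 / np.linalg.norm(A, 2) ** 2)
print(np.count_nonzero(x), float(np.linalg.norm(dx_dlam)))
```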
This list is automatically generated from the titles and abstracts of the papers on this site.