Symmetrized Robust Procrustes: Constant-Factor Approximation and Exact
Recovery
- URL: http://arxiv.org/abs/2207.08592v1
- Date: Mon, 18 Jul 2022 13:32:20 GMT
- Title: Symmetrized Robust Procrustes: Constant-Factor Approximation and Exact
Recovery
- Authors: Tal Amir, Shahar Kovalsky, Nadav Dym
- Abstract summary: We propose a novel convex relaxation for the Robust Procrustes problem.
- We find that our method performs similarly to the standard Iteratively Reweighted Least Squares (IRLS) method.
- Score: 6.157087562708549
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The classical $\textit{Procrustes}$ problem is to find a rigid motion
(orthogonal transformation and translation) that best aligns two given
point-sets in the least-squares sense. The $\textit{Robust Procrustes}$ problem
is an important variant, in which a power-1 objective is used instead of least
squares to improve robustness to outliers. While the optimal solution of the
least-squares problem can be easily computed in closed form, dating back to
Sch\"onemann (1966), no such solution is known for the power-1 problem. In this
paper we propose a novel convex relaxation for the Robust Procrustes problem.
Our relaxation enjoys several theoretical and practical advantages:
Theoretically, we prove that our method provides a $\sqrt{2}$-factor
approximation to the Robust Procrustes problem, and that, under appropriate
assumptions, it exactly recovers the true rigid motion from point
correspondences contaminated by outliers. In practice, we find in numerical
experiments on both synthetic and real robust Procrustes problems that our
method performs similarly to the standard Iteratively Reweighted Least Squares
(IRLS). However, the convexity of our algorithm allows incorporating additional
convex penalties, which are not readily amenable to IRLS. This turns out to be
a substantial advantage, leading to improved results in high-dimensional
problems, including non-rigid shape alignment and semi-supervised interlingual
word translation.
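For context, the closed-form least-squares solution referenced above (Schönemann, 1966) takes only a few lines, and the IRLS baseline the paper compares against amounts to repeatedly solving a reweighted version of it. The following is a minimal NumPy sketch; the function names and the simple 1/residual reweighting scheme are our illustration, not the paper's proposed method:

```python
import numpy as np

def procrustes_lsq(X, Y):
    """Closed-form least-squares Procrustes (Schonemann, 1966):
    orthogonal R and translation t minimizing sum ||R x_i + t - y_i||^2."""
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    A = (Y - my).T @ (X - mx)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(A)
    R = U @ Vt                         # optimal orthogonal transformation
    t = my - R @ mx
    return R, t

def procrustes_irls(X, Y, iters=50, eps=1e-8):
    """Baseline IRLS for the power-1 (robust) objective
    sum ||R x_i + t - y_i||: each step solves a weighted
    least-squares Procrustes problem, then reweights by 1/residual."""
    w = np.ones(len(X))
    for _ in range(iters):
        mx = (w[:, None] * X).sum(0) / w.sum()   # weighted centroids
        my = (w[:, None] * Y).sum(0) / w.sum()
        A = ((Y - my) * w[:, None]).T @ (X - mx)
        U, _, Vt = np.linalg.svd(A)
        R = U @ Vt
        t = my - R @ mx
        r = np.linalg.norm(X @ R.T + t - Y, axis=1)
        w = 1.0 / np.maximum(r, eps)   # large residuals (outliers) get small weight
    return R, t
```

The reweighting step is what makes IRLS non-convex as a whole, which is the paper's motivation for a convex relaxation that admits extra convex penalties.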
Related papers
- Trust-Region Sequential Quadratic Programming for Stochastic Optimization with Random Models [57.52124921268249]
We propose a trust-region sequential quadratic programming (TR-SQP) method to find both first- and second-order stationary points.
To converge to first-order stationary points, our method computes a gradient step in each iteration, defined by minimizing a quadratic approximation of the objective subject to a trust-region constraint.
To converge to second-order stationary points, our method additionally computes an eigen step to explore the negative curvature of the reduced Hessian matrix.
arXiv Detail & Related papers (2024-09-24T04:39:47Z) - Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that policies under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z) - Stochastic Optimization for Non-convex Problem with Inexact Hessian
Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) and adaptive regularization using cubics (ARC) methods have proven to have very appealing theoretical properties.
We show that TR and ARC methods can simultaneously allow inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - A Block Coordinate Descent Method for Nonsmooth Composite Optimization
under Orthogonality Constraints [7.716156977428555]
Nonsmooth composite optimization with orthogonality constraints has a broad spectrum of applications in statistical learning and data science.
OBCD is a new block coordinate descent method for solving such nonsmooth composite problems.
arXiv Detail & Related papers (2023-04-07T13:44:59Z) - Infeasible Deterministic, Stochastic, and Variance-Reduction Algorithms for Optimization under Orthogonality Constraints [9.301728976515255]
This article provides new practical and theoretical developments for the landing algorithm.
First, the method is extended to the Stiefel manifold.
We also consider variance reduction algorithms when the cost function is an average of many functions.
arXiv Detail & Related papers (2023-03-29T07:36:54Z) - Pseudonorm Approachability and Applications to Regret Minimization [73.54127663296906]
We convert high-dimensional $\ell_\infty$-approachability problems to low-dimensional pseudonorm approachability problems.
We develop an algorithmic theory of pseudonorm approachability, analogous to previous work on approachability for $\ell_2$ and other norms.
arXiv Detail & Related papers (2023-02-03T03:19:14Z) - Hybrid Trilinear and Bilinear Programming for Aligning Partially
Overlapping Point Sets [85.71360365315128]
In many applications, we need algorithms that can align partially overlapping point sets, as in the robust point matching (RPM) algorithm.
We first show that the objective is a cubic polynomial function. We then utilize the convex envelopes of trilinear and bilinear monomial transformations to derive its lower bound.
We next develop a branch-and-bound (BnB) algorithm which only branches over the transformation variables and runs efficiently.
arXiv Detail & Related papers (2021-01-19T04:24:23Z) - Efficient Projection-Free Algorithms for Saddle Point Problems [39.88460595129901]
We study projection-free algorithms for convex-strongly-concave saddle point problems with complicated constraints.
Our method combines Conditional Gradient Sliding with Mirror-Prox, and we show that it requires only $\tilde{O}(1/\sqrt{\epsilon})$ evaluations and $\tilde{O}(1/\epsilon^2)$ linear optimizations in the batch setting.
arXiv Detail & Related papers (2020-10-21T15:05:53Z) - Gradient-Free Methods for Saddle-Point Problem [125.99533416395765]
We generalize the approach of Gasnikov et al., 2017, which allows solving (stochastic) convex optimization problems with an inexact gradient-free oracle.
Our approach reduces the required number of oracle calls by a factor of $\frac{n}{\log n}$.
In the second part of the paper, we analyze the case when such an assumption cannot be made, and we propose a general approach for modifying the method to solve this problem.
arXiv Detail & Related papers (2020-05-12T16:44:27Z) - Quasi-Newton Solver for Robust Non-Rigid Registration [35.66014845211251]
We propose a formulation for robust non-rigid registration based on a globally smooth robust estimator for data fitting and regularization.
We apply the majorization-minimization algorithm to the problem, which reduces each iteration to solving a simple least-squares problem with L-BFGS.
arXiv Detail & Related papers (2020-04-09T01:45:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.