Generalized Orthogonal Procrustes Problem under Arbitrary Adversaries
- URL: http://arxiv.org/abs/2106.15493v2
- Date: Sat, 13 Jan 2024 16:44:36 GMT
- Title: Generalized Orthogonal Procrustes Problem under Arbitrary Adversaries
- Authors: Shuyang Ling
- Abstract summary: We study the semidefinite relaxation (SDR) and an iterative method named generalized power method (GPM) to find the least squares estimator.
In addition, we analyze the low-rank factorization algorithm and show the corresponding optimization landscape is free of spurious local minimizers.
- Score: 1.0152838128195467
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The generalized orthogonal Procrustes problem (GOPP) plays a fundamental role
in several scientific disciplines including statistics, imaging science and
computer vision. Despite its tremendous practical importance, it is generally
an NP-hard problem to find the least squares estimator. We study the
semidefinite relaxation (SDR) and an iterative method named generalized power
method (GPM) to find the least squares estimator, and investigate the
performance under a signal-plus-noise model. We show that the SDR recovers the
least squares estimator exactly and, moreover, that the generalized power method
with a proper initialization converges linearly to the global minimizer of the
SDR, provided that the signal-to-noise ratio is large. The main technique is to
show that the nonlinear mapping involved in the GPM is essentially a local
contraction mapping; applying the well-known Banach fixed-point theorem then
finishes the proof. In addition, we analyze the low-rank factorization
algorithm and show that the corresponding optimization landscape is free of
spurious local minimizers under nearly identical conditions to those that
enable the success of the SDR approach. The highlight of our work is that the
theoretical guarantees are purely algebraic and do not assume any statistical
priors on the additive adversaries, and thus they apply to various interesting
settings.
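As an illustration of the generalized power method described in the abstract, the sketch below implements one plausible GPM iteration for the GOPP under a signal-plus-noise model: build the block Gram matrix with blocks $A_i^\top A_j$, repeatedly multiply it by the stacked iterate, and project each $d \times d$ block back onto the orthogonal group via its polar factor. The function names, the block-Gram formulation, and the random initialization are illustrative assumptions, not the paper's reference implementation (the theory above analyzes a proper, e.g. spectral, initialization).

```python
import numpy as np

def polar_project(M):
    """Project a square matrix onto the orthogonal group via its polar factor (SVD)."""
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt

def generalized_power_method(A_list, n_iter=100, seed=0):
    """Illustrative GPM sketch for the GOPP (not the paper's reference code).

    A_list: n data matrices A_i of shape (m, d), assumed to be noisy orthogonal
    transforms of a common template (the signal-plus-noise model).
    Returns a list of estimated orthogonal matrices O_i in O(d).
    """
    n, d = len(A_list), A_list[0].shape[1]
    # Block Gram matrix C with (i, j) block A_i^T A_j, of shape (n*d, n*d).
    C = np.block([[A_list[i].T @ A_list[j] for j in range(n)] for i in range(n)])
    rng = np.random.default_rng(seed)
    # Random orthogonal initialization for illustration only; the guarantees
    # quoted above assume a proper (e.g. spectral) initialization instead.
    O = np.vstack([polar_project(rng.standard_normal((d, d))) for _ in range(n)])
    for _ in range(n_iter):
        CO = C @ O  # apply the block Gram matrix to the stacked iterate
        # Project each d x d block back onto the orthogonal group.
        O = np.vstack([polar_project(CO[i * d:(i + 1) * d]) for i in range(n)])
    return [O[i * d:(i + 1) * d] for i in range(n)]
```

The polar projection (via SVD) is the blockwise nearest-orthogonal-matrix map, so each sweep is an instance of the nonlinear fixed-point mapping whose local contraction property the abstract refers to, under these assumed conventions.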
Related papers
- A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variations of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z) - On the Estimation Performance of Generalized Power Method for
Heteroscedastic Probabilistic PCA [21.9585534723895]
We show that, given a suitable initialization, the distance between the GPM iterates and the ground truth decreases at least geometrically down to a threshold determined by the residual part of a certain decomposition.
In this paper, we also demonstrate superior estimation performance under sub-Gaussian noise settings.
arXiv Detail & Related papers (2023-12-06T11:41:17Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Asymptotically Unbiased Instance-wise Regularized Partial AUC
Optimization: Theory and Algorithm [101.44676036551537]
One-way Partial AUC (OPAUC) and Two-way Partial AUC (TPAUC) measure the average performance of a binary classifier.
Most of the existing methods could only optimize PAUC approximately, leading to inevitable biases that are not controllable.
We present a simpler reformulation of the PAUC problem via distributionally robust optimization (DRO).
arXiv Detail & Related papers (2022-10-08T08:26:22Z) - Lattice partition recovery with dyadic CART [79.96359947166592]
We study piece-wise constant signals corrupted by additive Gaussian noise over a $d$-dimensional lattice.
Data of this form naturally arise in a host of applications, and the tasks of signal detection or testing, de-noising and estimation have been studied extensively in the statistical and signal processing literature.
In this paper we consider instead the problem of partition recovery, i.e., of estimating the partition of the lattice induced by the constancy regions of the unknown signal.
We prove that a DCART-based procedure consistently estimates the underlying partition at a rate of order $\sigma^2 k^*$
arXiv Detail & Related papers (2021-05-27T23:41:01Z) - A Retrospective Approximation Approach for Smooth Stochastic
Optimization [0.2867517731896504]
Stochastic Gradient (SG) is the de facto iterative technique to solve stochastic optimization (SO) problems with a smooth (non-convex) objective $f$.
arXiv Detail & Related papers (2021-03-07T16:29:36Z) - Proximal and Federated Random Reshuffling [11.83842808044211]
We propose two new algorithms for Random Reshuffling.
ProxRR and FedRR solve composite convex finite-sum minimization problems.
ProxRR is faster than algorithms that evaluate the proximal operator in every iteration.
arXiv Detail & Related papers (2021-02-12T18:59:24Z) - Continuous Wasserstein-2 Barycenter Estimation without Minimax
Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
arXiv Detail & Related papers (2021-02-02T21:01:13Z) - On Projection Robust Optimal Transport: Sample Complexity and Model
Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - Fundamental Limits of Ridge-Regularized Empirical Risk Minimization in
High Dimensions [41.7567932118769]
Empirical Risk Minimization algorithms are widely used in a variety of estimation and prediction tasks.
In this paper, we characterize for the first time the fundamental limits on the statistical accuracy of convex ERM for inference.
arXiv Detail & Related papers (2020-06-16T04:27:38Z)