Finding the Sparsest Vectors in a Subspace: Theory, Algorithms, and
Applications
- URL: http://arxiv.org/abs/2001.06970v1
- Date: Mon, 20 Jan 2020 04:48:43 GMT
- Title: Finding the Sparsest Vectors in a Subspace: Theory, Algorithms, and
Applications
- Authors: Qing Qu, Zhihui Zhu, Xiao Li, Manolis C. Tsakiris, John Wright, and
René Vidal
- Abstract summary: The problem of finding the sparsest vector (direction) in a low dimensional subspace can be considered as a homogeneous variant of the sparse recovery problem.
- Score: 34.269044536641665
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The problem of finding the sparsest vector (direction) in a low dimensional
subspace can be considered as a homogeneous variant of the sparse recovery
problem, which finds applications in robust subspace recovery, dictionary
learning, sparse blind deconvolution, and many other problems in signal
processing and machine learning. However, in contrast to the classical sparse
recovery problem, the most natural formulation for finding the sparsest vector
in a subspace is usually nonconvex. In this paper, we overview recent advances
in global nonconvex optimization theory for solving this problem, ranging from
geometric analysis of its optimization landscapes, to efficient optimization
algorithms for solving the associated nonconvex optimization problem, to
applications in machine intelligence, representation learning, and imaging
sciences. Finally, we conclude this review by pointing out several interesting
open problems for future research.
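To make the formulation concrete, the sketch below solves the nonconvex program $\min_q \|Yq\|_1$ subject to $\|q\|_2 = 1$ by Riemannian subgradient descent, one of the algorithm families this review covers. It assumes an orthonormal basis $Y$ of the subspace; the step-size schedule, iteration count, and the usage example are illustrative choices, not the paper's prescriptions.

```python
import numpy as np

def sparsest_vector(Y, steps=500, lr=0.1, decay=0.99, seed=0):
    """Minimal sketch (illustrative hyperparameters): minimize ||Y q||_1
    over the unit sphere ||q||_2 = 1 by Riemannian subgradient descent.
    Y is an n x p orthonormal basis of the subspace; Y q is the candidate
    sparse vector."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(Y.shape[1])
    q /= np.linalg.norm(q)
    for t in range(steps):
        g = Y.T @ np.sign(Y @ q)      # Euclidean subgradient of ||Y q||_1
        g -= (g @ q) * q              # project onto the sphere's tangent space
        q -= lr * (decay ** t) * g    # geometrically decaying step size
        q /= np.linalg.norm(q)        # retract back onto the sphere
    return Y @ q

# Usage: plant one sparse direction inside a random 5-dimensional subspace.
rng = np.random.default_rng(1)
x = np.zeros(100); x[:5] = 1.0        # sparse planted vector
B = np.column_stack([x, rng.standard_normal((100, 4))])
Y, _ = np.linalg.qr(B)                # orthonormal basis of span(B)
v = sparsest_vector(Y)                # should concentrate on few coordinates
```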
Related papers
- On Probabilistic Embeddings in Optimal Dimension Reduction [1.2085509610251701]
Dimension reduction algorithms are a crucial part of many data science pipelines.
Despite their wide utilization, many non-linear dimension reduction algorithms are poorly understood from a theoretical perspective.
arXiv Detail & Related papers (2024-08-05T12:46:21Z) - A Guide to Stochastic Optimisation for Large-Scale Inverse Problems [4.926711494319977]
Stochastic optimisation algorithms are the de facto standard for machine learning with large amounts of data.
We provide a comprehensive account of the state-of-the-art in optimisation from the viewpoint of inverse problems.
We focus on the challenges for optimisation that are unique and are not commonly encountered in machine learning.
arXiv Detail & Related papers (2024-06-10T15:02:30Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling the differentiable and non-differentiable parts separately, linearizing only the smooth parts (a minimal conditional-gradient sketch appears after this list).
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - On a class of geodesically convex optimization problems solved via
Euclidean MM methods [50.428784381385164]
We show how a difference of Euclidean convex functions arises in several types of problems in statistics and machine learning, so that they can be solved via Euclidean MM methods.
Ultimately, we hope this work helps broaden the scope of such methods.
arXiv Detail & Related papers (2022-06-22T23:57:40Z) - Noisy Low-rank Matrix Optimization: Geometry of Local Minima and
Convergence Rate [14.191310794366075]
This paper is concerned with low-rank matrix optimization, which has found a wide range of applications in machine learning.
We develop a framework that can deal with random corruptions to general objective functions, where the noise model is arbitrary.
We characterize the geometry of the spurious local minima of the problem in a local region around the ground truth when the RIP constant is greater than $1/3$.
arXiv Detail & Related papers (2022-03-08T07:44:47Z) - Analysis of Truncated Orthogonal Iteration for Sparse Eigenvector
Problems [78.95866278697777]
We propose two variants of the Truncated Orthogonal Iteration to compute multiple leading eigenvectors with sparsity constraints simultaneously (a minimal sketch appears after this list).
We then apply our algorithms to solve the sparse principal component analysis problem for a wide range of test datasets.
arXiv Detail & Related papers (2021-03-24T23:11:32Z) - Why Do Local Methods Solve Nonconvex Problems? [54.284687261929115]
Nonconvex optimization is ubiquitous in modern machine learning, yet local methods routinely solve such problems in practice.
We rigorously formalize this behavior for key instances of machine learning problems.
We hypothesize a unified explanation for this phenomenon.
arXiv Detail & Related papers (2021-03-24T19:34:11Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex problems.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - Efficient Methods for Structured Nonconvex-Nonconcave Min-Max
Optimization [98.0595480384208]
We propose a generalization of the extragradient algorithm which converges to a stationary point (a minimal extragradient sketch appears after this list).
The algorithm applies not only to Euclidean spaces, but also to general $\ell_p$-normed finite-dimensional vector spaces.
arXiv Detail & Related papers (2020-10-31T21:35:42Z) - Accelerated Algorithms for Convex and Non-Convex Optimization on
Manifolds [9.632674803757475]
We propose a scheme for solving convex and non-convex optimization problems on manifolds.
Our proposed algorithm adapts to the level of complexity in the objective function.
arXiv Detail & Related papers (2020-10-18T02:48:22Z) - On Distributed Non-convex Optimization: Projected Subgradient Method For
Weakly Convex Problems in Networks [13.385373310554327]
Using the Moreau envelope, we show that the subgradient method converges linearly for sharp weakly convex problems in machine learning.
A distributed implementation of the projected subgradient method with a theoretical guarantee is proposed (a minimal single-machine sketch appears after this list).
arXiv Detail & Related papers (2020-04-28T01:01:49Z)
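The sketches below illustrate algorithmic ideas referenced in the list above; each is a minimal, hedged example rather than the cited paper's exact method. First, for the fully composite linearization entry: a plain conditional-gradient step that linearizes only the smooth objective over a convex compact set, here the $\ell_1$ ball. The cited paper's scheme additionally handles a non-differentiable component, which this sketch omits.

```python
import numpy as np

def conditional_gradient(grad_f, lmo, x0, steps=200):
    """Sketch of linearization over a convex compact set C: at each step,
    linearize the smooth part f at x_t and move toward the minimizer of
    the linear model over C (Frank-Wolfe). Not the paper's exact scheme."""
    x = x0.copy()
    for t in range(steps):
        s = lmo(grad_f(x))              # argmin_{s in C} <grad f(x), s>
        x += 2.0 / (t + 2.0) * (s - x)  # standard open-loop step size
    return x

# Example: least squares over the unit l1 ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
def l1_lmo(g):                          # l1-ball vertex minimizing <g, s>
    s = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    s[i] = -np.sign(g[i])
    return s
x = conditional_gradient(lambda x: 2 * A.T @ (A @ x - b), l1_lmo, np.zeros(50))
```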
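For the Truncated Orthogonal Iteration entry: a sketch of block orthogonal iteration with hard thresholding to compute several sparse leading eigenvectors at once. The truncation rule and parameters here are illustrative assumptions, not the paper's two variants.

```python
import numpy as np

def truncated_orthogonal_iteration(A, m=2, k=5, steps=100, seed=0):
    """Sketch: compute m leading eigenvectors of a symmetric matrix A with
    at most k nonzeros each, alternating a power step, QR orthonormalization,
    and hard thresholding to each column's k largest-magnitude entries."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((A.shape[0], m))
    for _ in range(steps):
        X = A @ X                        # block power (orthogonal iteration) step
        X, _ = np.linalg.qr(X)           # re-orthonormalize the block
        for j in range(m):               # truncate: keep k largest entries
            idx = np.argsort(np.abs(X[:, j]))[:-k]
            X[idx, j] = 0.0
        X /= np.linalg.norm(X, axis=0)   # renormalize columns after truncation
    return X

# Usage: a covariance matrix with two sparse spikes.
rng = np.random.default_rng(1)
u = np.zeros(50); u[:5] = 1 / np.sqrt(5)
v = np.zeros(50); v[5:10] = 1 / np.sqrt(5)
A = 4 * np.outer(u, u) + 2 * np.outer(v, v) + 0.01 * np.eye(50)
X = truncated_orthogonal_iteration(A, m=2, k=5)
```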
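For the nonconvex-nonconcave min-max entry: the classical extragradient step for $\min_x \max_y f(x, y)$, which the cited paper generalizes. The bilinear test problem and step size are illustrative assumptions.

```python
import numpy as np

def extragradient(grad_x, grad_y, x, y, lr=0.1, steps=1000):
    """Classical extragradient for min_x max_y f(x, y): take a lookahead
    (extrapolation) step, then update from the gradients evaluated at the
    lookahead point. The cited paper generalizes this scheme."""
    for _ in range(steps):
        xh = x - lr * grad_x(x, y)       # lookahead descent step in x
        yh = y + lr * grad_y(x, y)       # lookahead ascent step in y
        x = x - lr * grad_x(xh, yh)      # update using lookahead gradients
        y = y + lr * grad_y(xh, yh)
    return x, y

# Usage: bilinear game f(x, y) = x^T M y, where plain gradient
# descent-ascent cycles but extragradient converges toward (0, 0).
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
gx = lambda x, y: M @ y                  # df/dx
gy = lambda x, y: M.T @ x                # df/dy
x, y = extragradient(gx, gy, np.ones(3), np.ones(3))
```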
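For the projected subgradient entry: a single-machine sketch of the projected subgradient method with geometrically decaying step sizes, the regime in which linear convergence holds for sharp problems. The problem instance and parameters are illustrative; the cited paper's contribution is the distributed, networked version with guarantees.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, lr=0.5, decay=0.97, steps=300):
    """Sketch: projected subgradient method with geometrically decaying
    step sizes. Single-machine only; not the paper's distributed scheme."""
    x = x0.copy()
    for t in range(steps):
        x = project(x - lr * (decay ** t) * subgrad(x))
    return x

# Usage: minimize ||Ax - b||_1 over a Euclidean ball. This instance is
# convex and sharp, a special case of the weakly convex setting.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = A @ rng.standard_normal(20)
sg = lambda x: A.T @ np.sign(A @ x - b)  # subgradient of ||Ax - b||_1
proj = lambda z: z if np.linalg.norm(z) <= 10 else 10 * z / np.linalg.norm(z)
x = projected_subgradient(sg, proj, np.zeros(20))
```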