Efficient Algorithms for Sparse Moment Problems without Separation
- URL: http://arxiv.org/abs/2207.13008v2
- Date: Sun, 23 Jul 2023 04:52:37 GMT
- Title: Efficient Algorithms for Sparse Moment Problems without Separation
- Authors: Zhiyuan Fan and Jian Li
- Abstract summary: We consider the sparse moment problem of learning a $k$-spike mixture in high-dimensional space from noisy moment information.
Previous algorithms either assume certain separation assumptions, use more recovery moments, or run in (super) exponential time.
Our algorithm for the high-dimensional case determines the coordinates of each spike by aligning a 1d projection of the mixture to a random vector and a set of 2d projections of the mixture.
- Score: 6.732901486505047
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the sparse moment problem of learning a $k$-spike mixture in
high-dimensional space from its noisy moment information in any dimension. We
measure the accuracy of the learned mixtures using transportation distance.
Previous algorithms either assume certain separation assumptions, use more
recovery moments, or run in (super) exponential time. Our algorithm for the
one-dimensional problem (also called the sparse Hausdorff moment problem) is a
robust version of the classic Prony's method, and our contribution mainly lies
in the analysis. We adopt a global and much tighter analysis than previous work
(which analyzes the perturbation of the intermediate results of Prony's
method). A useful technical ingredient is a connection between the linear
system defined by the Vandermonde matrix and the Schur polynomial, which allows
us to provide a tight perturbation bound independent of the separation and may be
useful in other contexts. To tackle the high-dimensional problem, we first
solve the two-dimensional problem by extending the one-dimensional algorithm
and analysis to complex numbers. Our algorithm for the high-dimensional case
determines the coordinates of each spike by aligning a 1d projection of the
mixture to a random vector and a set of 2d projections of the mixture. Our
results have applications to learning topic models and Gaussian mixtures,
implying improved sample complexity results or running time over prior work.
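As a rough illustration of the one-dimensional recovery primitive, the classical (noiseless) Prony's method recovers a $k$-spike mixture from its first $2k$ moments. The sketch below is a minimal textbook version of that method, not the robust algorithm or analysis of the paper:

```python
import numpy as np

def prony_1d(moments, k):
    """Recover spike locations x_i and weights w_i of a k-spike mixture
    from its first 2k moments m_j = sum_i w_i * x_i**j (noiseless case)."""
    m = np.asarray(moments, dtype=float)  # m[0], ..., m[2k-1]
    # Prony's linear system: sum_{t<k} c_t m_{j+t} = -m_{j+k} for j = 0..k-1
    H = np.array([[m[j + t] for t in range(k)] for j in range(k)])
    c = np.linalg.solve(H, -m[k:2 * k])
    # Spike locations are the roots of x^k + c_{k-1} x^{k-1} + ... + c_0
    locs = np.real(np.roots(np.concatenate(([1.0], c[::-1]))))
    # Weights solve the Vandermonde system V w = (m_0, ..., m_{k-1})
    V = np.vander(locs, k, increasing=True).T
    w = np.linalg.solve(V, m[:k])
    return locs, w
```

With noisy moments the Hankel and Vandermonde systems can be badly conditioned when spikes are close; the paper's contribution is precisely a separation-independent perturbation analysis of this pipeline.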
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we present a robust phase retrieval problem where the task is to recover an unknown signal.
Our proposed method avoids the need for computationally expensive spectral initialization, using a simple gradient step that is robust to outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z)
- Fast Optimal Locally Private Mean Estimation via Random Projections [58.603579803010796]
We study the problem of locally private mean estimation of high-dimensional vectors in the Euclidean ball.
We propose a new algorithmic framework, ProjUnit, for private mean estimation.
Our framework is deceptively simple: each randomizer projects its input to a random low-dimensional subspace, normalizes the result, and then runs an optimal algorithm.
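The client step described above can be sketched as follows. `projunit_encode` is a hypothetical helper name, the Gaussian matrix is one of several valid random-subspace choices, and the subsequent local randomizer is omitted:

```python
import numpy as np

def projunit_encode(x, proj_dim, rng):
    """Project a unit vector x to a random low-dimensional subspace and
    renormalize, as in the ProjUnit client step (illustrative sketch)."""
    d = x.shape[0]
    # Random subspace via a scaled Gaussian matrix
    W = rng.normal(size=(proj_dim, d)) / np.sqrt(proj_dim)
    y = W @ x
    n = np.linalg.norm(y)
    return y / n if n > 0 else y
```

The renormalization keeps the projected input on the low-dimensional unit sphere, so an optimal low-dimensional mean-estimation randomizer can be applied unchanged.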
arXiv Detail & Related papers (2023-06-07T14:07:35Z)
- Sublinear Time Algorithms for Several Geometric Optimization (With Outliers) Problems In Machine Learning [8.794896028549323]
We revisit the Minimum Enclosing Ball (MEB) problem in Euclidean space $\mathbb{R}^d$.
We introduce the notion of stability for MEB, which is natural and easy to understand.
Under the stability assumption, we present two sampling algorithms for computing radius-approximate MEB.
arXiv Detail & Related papers (2023-01-07T15:03:45Z)
- Covariance-Free Sparse Bayesian Learning [62.24008859844098]
We introduce a new SBL inference algorithm that avoids explicit inversions of the covariance matrix.
Our method can be up to thousands of times faster than existing baselines.
We showcase how our new algorithm enables SBL to tractably tackle high-dimensional signal recovery problems.
arXiv Detail & Related papers (2021-05-21T16:20:07Z)
- Analysis of Truncated Orthogonal Iteration for Sparse Eigenvector Problems [78.95866278697777]
We propose two variants of the Truncated Orthogonal Iteration to compute multiple leading eigenvectors with sparsity constraints simultaneously.
We then apply our algorithms to solve the sparse principal component analysis problem for a wide range of test datasets.
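A minimal sketch of the underlying idea, assuming orthogonal iteration with a per-column hard-thresholding (truncation) step; this is illustrative only, not either of the paper's two variants:

```python
import numpy as np

def truncated_orthogonal_iteration(A, k, s, iters=100):
    """Approximate k leading eigenvectors of symmetric A, keeping only the
    s largest-magnitude entries per column at each step (sparsity)."""
    n = A.shape[0]
    Q = np.linalg.qr(np.random.default_rng(0).normal(size=(n, k)))[0]
    for _ in range(iters):
        Y = A @ Q
        # Truncation step: zero out all but the top-s entries of each column
        for j in range(k):
            small = np.argsort(np.abs(Y[:, j]))[:-s]
            Y[small, j] = 0.0
        # Re-orthonormalize, as in classical orthogonal iteration
        Q = np.linalg.qr(Y)[0]
    return Q
```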
arXiv Detail & Related papers (2021-03-24T23:11:32Z)
- PCA Reduced Gaussian Mixture Models with Applications in Superresolution [1.885698488789676]
This paper provides a twofold contribution to the topic.
First, we propose a Gaussian Mixture Model in conjunction with a reduction of the dimensionality of the data in each component of the model.
Second, we apply our PCA-GMM for the superresolution of 2D and 3D material images based on the approach of Sandeep and Jacob.
arXiv Detail & Related papers (2020-09-16T07:33:56Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT)
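A baseline sketch-and-solve version with a Gaussian embedding, shown only to illustrate what "solving via sketching" means here (the paper's adaptive, effective-dimension-aware method is more refined):

```python
import numpy as np

def sketched_ridge(A, b, lam, sketch_dim, rng):
    """Solve an L2-regularized least-squares problem on sketched data:
    replace (A, b) by (SA, Sb) for a random Gaussian S, then solve the
    small d x d ridge system exactly."""
    n, d = A.shape
    S = rng.normal(size=(sketch_dim, n)) / np.sqrt(sketch_dim)
    SA, Sb = S @ A, S @ b
    # Normal equations of the sketched ridge problem
    return np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)
```

An SRHT sketch would replace the dense Gaussian `S` with a subsampled randomized Hadamard transform, reducing the sketching cost from O(mnd) to roughly O(nd log n).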
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
- Efficient Algorithms for Multidimensional Segmented Regression [42.046881924063044]
We study the fundamental problem of fixed-design multidimensional segmented regression.
We provide the first sample and computationally efficient algorithm for this problem in any fixed dimension.
Our algorithm relies on a simple merging iterative approach, which is novel in the multidimensional setting.
arXiv Detail & Related papers (2020-03-24T19:39:34Z)
- Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
arXiv Detail & Related papers (2020-02-21T17:45:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.