A Spectral Algorithm for List-Decodable Covariance Estimation in
Relative Frobenius Norm
- URL: http://arxiv.org/abs/2305.00966v1
- Date: Mon, 1 May 2023 17:54:35 GMT
- Title: A Spectral Algorithm for List-Decodable Covariance Estimation in
Relative Frobenius Norm
- Authors: Ilias Diakonikolas, Daniel M. Kane, Jasper C. H. Lee, Ankit Pensia,
Thanasis Pittas
- Abstract summary: We produce a list of hypotheses that are close to $\Sigma$ in relative Frobenius norm.
As a corollary, we obtain an efficient spectral algorithm for robust partial clustering of Gaussian mixture models.
Our new method yields the first Sum-of-Squares-free algorithm for robustly learning GMMs.
- Score: 41.03423042792559
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of list-decodable Gaussian covariance estimation. Given
a multiset $T$ of $n$ points in $\mathbb R^d$ such that an unknown $\alpha<1/2$
fraction of points in $T$ are i.i.d. samples from an unknown Gaussian
$\mathcal{N}(\mu, \Sigma)$, the goal is to output a list of $O(1/\alpha)$
hypotheses at least one of which is close to $\Sigma$ in relative Frobenius
norm. Our main result is a $\mathrm{poly}(d,1/\alpha)$ sample and time
algorithm for this task that guarantees relative Frobenius norm error of
$\mathrm{poly}(1/\alpha)$. Importantly, our algorithm relies purely on spectral
techniques. As a corollary, we obtain an efficient spectral algorithm for
robust partial clustering of Gaussian mixture models (GMMs) -- a key ingredient
in the recent work of [BDJ+22] on robustly learning arbitrary GMMs. Combined
with the other components of [BDJ+22], our new method yields the first
Sum-of-Squares-free algorithm for robustly learning GMMs. At the technical
level, we develop a novel multi-filtering method for list-decodable covariance
estimation that may be useful in other settings.
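To make the error metric and the flavor of "purely spectral techniques" concrete, here is a minimal NumPy sketch. It is not the paper's multi-filtering algorithm: it only computes the relative Frobenius norm $\|\Sigma^{-1/2}\hat\Sigma\,\Sigma^{-1/2} - I\|_F$ and runs one naive spectral filtering step (the threshold `z` and the single-pass filter are illustrative choices, not from the paper).

```python
import numpy as np

def relative_frobenius_error(sigma_hat, sigma):
    """Relative Frobenius norm || Sigma^{-1/2} Sigma_hat Sigma^{-1/2} - I ||_F."""
    # Symmetric inverse square root of sigma via eigendecomposition.
    vals, vecs = np.linalg.eigh(sigma)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    d = sigma.shape[0]
    return np.linalg.norm(inv_sqrt @ sigma_hat @ inv_sqrt - np.eye(d), ord="fro")

def spectral_filter_step(points, z=2.0):
    """One naive spectral filtering step: project onto the top eigenvector of the
    empirical covariance and drop points whose projection deviates by more than
    z empirical standard deviations. (Illustrative; the paper iterates and
    maintains multiple candidate subsets.)"""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    top = vecs[:, -1]                  # direction of largest variance
    proj = centered @ top
    keep = np.abs(proj) <= z * proj.std()
    return points[keep]

rng = np.random.default_rng(0)
inliers = rng.standard_normal((900, 5))          # samples from N(0, I)
outliers = rng.standard_normal((100, 5)) + 20.0  # far-away corruption
data = np.vstack([inliers, outliers])

filtered = spectral_filter_step(data)
sigma_hat = np.cov(filtered, rowvar=False)
err = relative_frobenius_error(sigma_hat, np.eye(5))
```

With this well-separated corruption, a single filtering step already isolates the inliers; the paper's contribution is handling the much harder regime where inliers are a minority ($\alpha < 1/2$) and no single filtering pass suffices, which is what the multi-filter (outputting a list of hypotheses) addresses.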
Related papers
- Dimension Reduction via Sum-of-Squares and Improved Clustering Algorithms for Non-Spherical Mixtures [5.668124846154999]
We develop a new approach for clustering non-spherical (i.e., arbitrary component covariances) Gaussian mixture models via a subroutine.
Our method gives a non-spherical analog of the classical dimension reduction based on singular value decomposition.
Our algorithms naturally extend to tolerating a dimension-independent fraction of arbitrary outliers.
arXiv Detail & Related papers (2024-11-19T11:58:51Z)
- Sum-of-squares lower bounds for Non-Gaussian Component Analysis [33.80749804695003]
Non-Gaussian Component Analysis (NGCA) is the statistical task of finding a non-Gaussian direction in a high-dimensional dataset.
Here we study the complexity of NGCA in the Sum-of-Squares framework.
arXiv Detail & Related papers (2024-10-28T18:19:13Z)
- Private estimation algorithms for stochastic block models and mixture models [63.07482515700984]
We develop general tools for designing efficient private estimation algorithms.
We give the first efficient $(\epsilon, \delta)$-differentially private algorithm for both weak recovery and exact recovery.
arXiv Detail & Related papers (2023-01-11T09:12:28Z)
- List-Decodable Sparse Mean Estimation via Difference-of-Pairs Filtering [42.526664955704746]
We develop a novel, conceptually simpler technique for list-decodable sparse mean estimation.
In particular, for distributions with "certifiably bounded" $t$-th moments in $k$-sparse directions, our algorithm achieves error of $(1/\alpha)^{O(1/t)}$ with sample complexity $m = (k\log(n))^{O(t)}/\alpha$ and running time $\mathrm{poly}(mn^t)$.
For the special case of Gaussian inliers, our algorithm achieves the optimal error guarantee of $\Theta(\sqrt{\log(1/\alpha)})$.
arXiv Detail & Related papers (2022-06-10T17:38:18Z)
- Robust Sparse Mean Estimation via Sum of Squares [42.526664955704746]
We study the problem of high-dimensional sparse mean estimation in the presence of an $\epsilon$-fraction of adversarial outliers.
Our algorithms follow the Sum-of-Squares based proofs-to-algorithms approach.
arXiv Detail & Related papers (2022-06-07T16:49:54Z)
- Clustering Mixture Models in Almost-Linear Time via List-Decodable Mean Estimation [58.24280149662003]
We study the problem of list-decodable mean estimation, where an adversary can corrupt a majority of the dataset.
We develop new algorithms for list-decodable mean estimation, achieving nearly-optimal statistical guarantees.
arXiv Detail & Related papers (2021-06-16T03:34:14Z)
- Robust Model Selection and Nearly-Proper Learning for GMMs [26.388358539260473]
In learning theory, a standard assumption is that the data is generated from a finite mixture model. But what happens when the number of components is not known in advance?
We are able to approximately determine the minimum number of components needed to fit the distribution within a logarithmic factor.
arXiv Detail & Related papers (2021-06-05T01:58:40Z)
- List-Decodable Mean Estimation in Nearly-PCA Time [50.79691056481693]
We study the fundamental task of list-decodable mean estimation in high dimensions.
Our algorithm runs in time $\widetilde{O}(ndk)$ for all $k = O(\sqrt{d}) \cup \Omega(d)$, where $n$ is the size of the dataset.
A variant of our algorithm has runtime $\widetilde{O}(ndk)$ for all $k$, at the expense of an $O(\sqrt{\log k})$ factor in the recovery guarantee.
arXiv Detail & Related papers (2020-11-19T17:21:37Z)
- Optimal Robust Linear Regression in Nearly Linear Time [97.11565882347772]
We study the problem of high-dimensional robust linear regression where a learner is given access to $n$ samples from the generative model $Y = \langle X, w^* \rangle + \epsilon$.
We propose estimators for this problem under two settings: (i) $X$ is L4-L2 hypercontractive, $\mathbb{E}[XX^\top]$ has bounded condition number, and $\epsilon$ has bounded variance, and (ii) $X$ is sub-Gaussian with identity second moment and $\epsilon$ is sub-Gaussian.
arXiv Detail & Related papers (2020-05-13T16:44:12Z)
- Robustly Learning any Clusterable Mixture of Gaussians [55.41573600814391]
We study the efficient learnability of high-dimensional Gaussian mixtures in the adversarial-robust setting.
We provide an algorithm that learns the components of an $\epsilon$-corrupted $k$-mixture within information-theoretically near-optimal error of $\tilde{O}(\epsilon)$.
Our main technical contribution is a new robust identifiability proof of clusters from a Gaussian mixture, which can be captured by the constant-degree Sum-of-Squares proof system.
arXiv Detail & Related papers (2020-05-13T16:44:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.