The Parametric Stability of Well-separated Spherical Gaussian Mixtures
- URL: http://arxiv.org/abs/2302.00242v1
- Date: Wed, 1 Feb 2023 04:52:13 GMT
- Title: The Parametric Stability of Well-separated Spherical Gaussian Mixtures
- Authors: Hanyu Zhang, Marina Meila
- Abstract summary: We quantify the parameter stability of a spherical Gaussian Mixture Model (sGMM) under small perturbations in distribution space.
We derive the first explicit bound to show that for a mixture of spherical Gaussians $P$ (sGMM) in a pre-defined model class, all other sGMMs close to $P$ in this model class in total variation distance have a small parameter distance to $P$.
- Score: 7.238973585403367
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We quantify the parameter stability of a spherical Gaussian Mixture Model
(sGMM) under small perturbations in distribution space. Namely, we derive the
first explicit bound to show that for a mixture of spherical Gaussians $P$
(sGMM) in a pre-defined model class, all other sGMMs close to $P$ in this
model class in total variation distance have a small parameter distance to
$P$. Further, this upper bound depends only on $P$. The motivation for this
work lies in providing guarantees for fitting Gaussian mixtures; with this aim
in mind, all the constants involved are well defined, and the conditions for
fitting mixtures of spherical Gaussians are distribution free. Our results
considerably tighten the existing computable bounds and asymptotically match
the known sharp thresholds for this problem.
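
To make the abstract's claim concrete, here is a hedged sketch of the shape of such a stability bound; the specific parameter distance and the constant $C(P)$ below are illustrative assumptions, not the paper's exact theorem.

```latex
% Illustrative only: the notation (K, \pi_k, \mu_k, \sigma_k, C(P)) is assumed here.
% Let P = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mu_k, \sigma_k^2 I_d) be an sGMM.
% Stability: there exists C(P), depending only on P, such that every sGMM Q
% (with parameters \pi'_j, \mu'_j, \sigma'_j) in the same model class satisfies
\[
  d_{\mathrm{TV}}(P, Q) \le \varepsilon
  \;\Longrightarrow\;
  \min_{\tau \in S_K} \max_{k} \Bigl(
      |\pi_k - \pi'_{\tau(k)}|
      + \|\mu_k - \mu'_{\tau(k)}\|
      + |\sigma_k - \sigma'_{\tau(k)}|
  \Bigr) \le C(P)\, \varepsilon ,
\]
% where the minimum over permutations \tau \in S_K accounts for label switching.
```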
Related papers
- Toward Global Convergence of Gradient EM for Over-Parameterized Gaussian Mixture Models [47.294535652946095]
We study the gradient Expectation-Maximization (EM) algorithm for Gaussian Mixture Models (GMMs).
This is the first global convergence result for Gaussian mixtures with more than $2$ components.
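
For intuition, a minimal numpy sketch of one gradient-EM update in a simplified setting; the unit-variance components, uniform mixing weights, and step size eta are assumptions made here for illustration, not the paper's exact setup.

```python
import numpy as np

def gradient_em_step(X, mu, eta=1.0):
    """One gradient-EM update of the means. X: (n, d) data, mu: (K, d) means."""
    # E-step: responsibilities under N(mu_k, I_d) with uniform mixing weights.
    sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)   # (n, K)
    logw = -0.5 * sq
    logw -= logw.max(axis=1, keepdims=True)                     # numerical stability
    w = np.exp(logw)
    w /= w.sum(axis=1, keepdims=True)
    # Gradient step on the EM surrogate instead of the closed-form M-step.
    grad = (w[:, :, None] * (X[:, None, :] - mu[None, :, :])).mean(axis=0)
    return mu + eta * grad
```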
arXiv Detail & Related papers (2024-06-29T16:44:29Z) - Theoretical Guarantees for Variational Inference with Fixed-Variance Mixture of Gaussians [27.20127082606962]
Variational inference (VI) is a popular approach in Bayesian inference.
This work aims to contribute to the theoretical study of VI in the non-Gaussian case.
arXiv Detail & Related papers (2024-06-06T12:38:59Z) - Riemannian optimization for non-centered mixture of scaled Gaussian
distributions [17.855338784378]
This paper studies the statistical model of the non-centered mixture of scaled Gaussian distributions (NC-MSG).
Using the Fisher-Rao information geometry associated with this distribution, we derive a Riemannian gradient descent algorithm.
A nearest centroid classifier is implemented leveraging the KL divergence and its associated center of mass.
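
A nearest centroid rule of this kind needs a divergence between fitted Gaussian models. As a generic sketch (plain Gaussian case only, not the paper's NC-MSG specialization), the closed-form KL divergence between two multivariate Gaussians:

```python
import numpy as np

def gaussian_kl(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) for d-dimensional Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    dm = mu1 - mu0
    trace_term = np.trace(S1_inv @ S0)
    quad_term = dm @ S1_inv @ dm
    _, logdet0 = np.linalg.slogdet(S0)   # log-determinants, numerically stable
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (trace_term + quad_term - d + logdet1 - logdet0)
```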
arXiv Detail & Related papers (2022-09-07T17:22:20Z) - Theoretical Error Analysis of Entropy Approximation for Gaussian Mixture [0.7499722271664147]
In this paper, we analyze the approximation error between the true entropy and its approximation to reveal when the approximation works effectively.
Our results provide a guarantee that this approximation works well in higher dimension problems.
arXiv Detail & Related papers (2022-02-26T04:49:01Z) - The Schr\"odinger Bridge between Gaussian Measures has a Closed Form [101.79851806388699]
We focus on the dynamic formulation of optimal transport (OT), also known as the Schr\"odinger bridge (SB) problem.
In this paper, we provide closed-form expressions for SBs between Gaussian measures.
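
For context, in the zero-noise limit the SB between Gaussians reduces to classical optimal transport, whose closed form is well known. A numpy sketch of that limiting cost; the paper's entropic SB correction is not reproduced here.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(mu0, S0, mu1, S1):
    """Squared 2-Wasserstein distance between N(mu0, S0) and N(mu1, S1)."""
    r0 = sqrtm(S0)
    cross = sqrtm(r0 @ S1 @ r0)             # (S0^{1/2} S1 S0^{1/2})^{1/2}
    bures = np.trace(S0 + S1 - 2.0 * np.real(cross))
    return float(np.sum((mu0 - mu1) ** 2) + bures)
```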
arXiv Detail & Related papers (2022-02-11T15:59:01Z) - A Robust and Flexible EM Algorithm for Mixtures of Elliptical
Distributions with Missing Data [71.9573352891936]
This paper tackles the problem of missing data imputation for noisy and non-Gaussian data.
A new EM algorithm is investigated for mixtures of elliptical distributions, designed to handle potentially missing data.
Experimental results on synthetic data demonstrate that the proposed algorithm is robust to outliers and can be used with non-Gaussian data.
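
EM algorithms typically handle missing entries by imputing with the conditional expectation of the missing block given the observed one. A sketch of that standard step for the plain Gaussian case; the paper's elliptical generalization and robustness mechanics are not reproduced here.

```python
import numpy as np

def impute_conditional_mean(x, mu, S):
    """Fill np.nan entries of x with E[x_mis | x_obs] under N(mu, S)."""
    miss = np.isnan(x)
    obs = ~miss
    x_new = x.copy()
    # E[x_mis | x_obs] = mu_mis + S_mo S_oo^{-1} (x_obs - mu_obs)
    S_mo = S[np.ix_(miss, obs)]
    S_oo = S[np.ix_(obs, obs)]
    x_new[miss] = mu[miss] + S_mo @ np.linalg.solve(S_oo, x[obs] - mu[obs])
    return x_new
```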
arXiv Detail & Related papers (2022-01-28T10:01:37Z) - Spectral clustering under degree heterogeneity: a case for the random
walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
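
A minimal numpy sketch of the embedding step described above; the choice of K and the final clustering of the embedded rows are illustrative assumptions.

```python
import numpy as np

def rw_spectral_embedding(A, K):
    """A: (n, n) symmetric adjacency of a connected graph; returns (n, K) embedding."""
    deg = A.sum(axis=1)
    P = A / deg[:, None]                 # random walk matrix D^{-1} A
    # Top eigenvectors of P are eigenvectors of L_rw = I - D^{-1} A with the
    # smallest eigenvalues; P is similar to a symmetric matrix, so they are real.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    return vecs[:, order[:K]].real       # row i represents node i; cluster the rows
```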
arXiv Detail & Related papers (2021-05-03T16:36:27Z) - Robustly Learning Mixtures of $k$ Arbitrary Gaussians [47.40835932474677]
We give a polynomial-time algorithm for the problem of robustly estimating a mixture of $k$ arbitrary Gaussians in $\mathbb{R}^d$, for any fixed $k$, in the presence of a constant fraction of arbitrary corruptions.
Our main tools are an efficient partial clustering algorithm that relies on the sum-of-squares method, and a novel tensor decomposition algorithm that allows errors in both Frobenius norm and low-rank terms.
arXiv Detail & Related papers (2020-12-03T17:54:03Z) - Self-regularizing Property of Nonparametric Maximum Likelihood Estimator
in Mixture Models [39.27013036481509]
We introduce the nonparametric maximum likelihood estimator (NPMLE) for general Gaussian mixtures.
We show that with high probability the NPMLE based on a sample of size $n$ has $O(\log n)$ atoms (mass points).
Notably, any mixture is statistically indistinguishable from a finite one with $O(\log n)$ components.
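
One standard computational device for approximating the NPMLE, assumed here purely for illustration (it is not this paper's contribution), is EM over a fixed grid of candidate atoms:

```python
import numpy as np

def npmle_grid(x, grid, iters=500):
    """x: (n,) observations; grid: (m,) candidate atom locations.
    Returns mixing weights over the grid for a unit-variance Gaussian mixture."""
    w = np.full(grid.shape[0], 1.0 / grid.shape[0])
    lik = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2)   # (n, m), up to a constant
    for _ in range(iters):
        post = lik * w                                       # joint, unnormalized
        post /= post.sum(axis=1, keepdims=True)              # responsibilities
        w = post.mean(axis=0)                                # EM weight update
    return w   # the fitted weights tend to concentrate on few atoms
```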
arXiv Detail & Related papers (2020-08-19T03:39:13Z) - Algebraic and Analytic Approaches for Parameter Learning in Mixture
Models [66.96778152993858]
We present two different approaches for parameter learning in several mixture models in one dimension.
For some of these distributions, our results represent the first guarantees for parameter estimation.
arXiv Detail & Related papers (2020-01-19T05:10:56Z)