Tensor decomposition for learning Gaussian mixtures from moments
- URL: http://arxiv.org/abs/2106.00555v1
- Date: Tue, 1 Jun 2021 15:11:08 GMT
- Title: Tensor decomposition for learning Gaussian mixtures from moments
- Authors: Rima Khouja (AROMATH), Pierre-Alexandre Mattei (MAASAI), Bernard
Mourrain (AROMATH)
- Abstract summary: In data processing and machine learning, an important challenge is to recover and exploit models that can accurately represent the data.
We investigate symmetric tensor decomposition methods for tackling this problem, where the tensor is built from empirical moments of the data distribution.
- Score: 6.576993289263191
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In data processing and machine learning, an important challenge is to recover
and exploit models that can accurately represent the data. We consider the
problem of recovering Gaussian mixture models from datasets. We investigate
symmetric tensor decomposition methods for tackling this problem, where the
tensor is built from empirical moments of the data distribution. We consider
identifiable tensors, which have a unique decomposition, showing that moment
tensors built from spherical Gaussian mixtures have this property. We prove
that symmetric tensors with interpolation degree strictly less than half their
order are identifiable and we present an algorithm, based on simple linear
algebra operations, to compute their decomposition. Illustrative
experiments show the effectiveness of the tensor decomposition method for
recovering Gaussian mixtures, in comparison with other state-of-the-art
approaches.
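As a concrete illustration of the kind of object the abstract refers to, the sketch below assembles an empirical third-order symmetric moment tensor from samples. This is a minimal, generic construction; the function name and interface are hypothetical and not the authors' implementation.

```python
import numpy as np

def empirical_moment_tensor(X, order=3):
    """Empirical symmetric moment tensor, e.g. E[x (x) x (x) x] for order 3,
    estimated from samples X of shape (n_samples, d)."""
    n, d = X.shape
    T = np.zeros((d,) * order)
    for x in X:
        outer = x
        for _ in range(order - 1):
            # build the rank-one term x (x) x (x) ... (x) x incrementally
            outer = np.multiply.outer(outer, x)
        T += outer
    return T / n
```

By construction the result is symmetric under any permutation of its indices, which is the property the decomposition methods in the paper rely on.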
Related papers
- Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - Moment Estimation for Nonparametric Mixture Models Through Implicit
Tensor Decomposition [7.139680863764187]
We present an alternating least squares-type numerical optimization scheme to estimate conditionally-independent mixture models in $\mathbb{R}^n$.
We compute the cumulative distribution functions, higher moments and other statistics of the component distributions through linear solves.
Numerical experiments demonstrate the competitive performance of the algorithm, and its applicability to many models and applications.
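For context, an alternating least squares scheme for tensor decomposition can be sketched as follows. This is a generic CP-ALS on a small dense 3-way tensor, not the implicit moment-based variant of the paper above; all names are illustrative.

```python
import numpy as np

def cp_als(T, rank, n_iter=500, seed=0):
    """Basic CP-ALS for a 3-way tensor T: find factors A, B, C with
    T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r], updating one factor at a time."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.normal(size=(I, rank))
    B = rng.normal(size=(J, rank))
    C = rng.normal(size=(K, rank))
    T1 = T.reshape(I, J * K)                     # mode-1 unfolding
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)  # mode-3 unfolding
    # column-wise Khatri-Rao product: kr(U, V)[j*V.shape[0]+k, r] = U[j,r]*V[k,r]
    kr = lambda U, V: (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])
    for _ in range(n_iter):
        # each step is an ordinary linear least squares problem in one factor
        A = T1 @ np.linalg.pinv(kr(B, C).T)
        B = T2 @ np.linalg.pinv(kr(A, C).T)
        C = T3 @ np.linalg.pinv(kr(A, B).T)
    return A, B, C
```

Each update fixes two factors and solves a linear least squares problem for the third, which is what makes the scheme "alternating least squares".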
arXiv Detail & Related papers (2022-10-25T23:31:33Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - Equivariance Discovery by Learned Parameter-Sharing [153.41877129746223]
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes.
Also, we theoretically analyze the method for Gaussian data and provide a bound on the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z) - A Robust and Flexible EM Algorithm for Mixtures of Elliptical
Distributions with Missing Data [71.9573352891936]
This paper tackles the problem of missing data imputation for noisy and non-Gaussian data.
A new EM algorithm is investigated for mixtures of elliptical distributions with the property of handling potential missing data.
Experimental results on synthetic data demonstrate that the proposed algorithm is robust to outliers and can be used with non-Gaussian data.
arXiv Detail & Related papers (2022-01-28T10:01:37Z) - Schema matching using Gaussian mixture models with Wasserstein distance [0.2676349883103403]
In this paper, we derive an approximation for the Wasserstein distance between Gaussian mixture models and reduce it to a linear problem.
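To illustrate how such a reduction to a linear problem can look, the sketch below couples the component weights of two spherical Gaussian mixtures by a linear program, using the closed-form squared W2 between spherical Gaussians as the transport cost. All names are hypothetical and this is one common construction under simplifying assumptions, not necessarily the paper's method.

```python
import numpy as np
from scipy.optimize import linprog

def mw2_spherical(w1, mu1, s1, w2, mu2, s2):
    """Approximate W2 between two spherical GMMs via optimal transport
    over components. w*: weights (sum to 1), mu*: (k, d) means,
    s*: per-component standard deviations."""
    k1, k2 = len(w1), len(w2)
    d = mu1.shape[1]
    # pairwise squared W2 between spherical components (closed form):
    # ||mu_i - mu_j||^2 + d * (s_i - s_j)^2
    C = np.array([[np.sum((mu1[i] - mu2[j]) ** 2) + d * (s1[i] - s2[j]) ** 2
                   for j in range(k2)] for i in range(k1)])
    # LP over the transport plan P: min <C, P>  s.t.  P 1 = w1, P^T 1 = w2, P >= 0
    A_eq = np.zeros((k1 + k2, k1 * k2))
    for i in range(k1):
        A_eq[i, i * k2:(i + 1) * k2] = 1.0   # row-marginal constraints
    for j in range(k2):
        A_eq[k1 + j, j::k2] = 1.0            # column-marginal constraints
    b_eq = np.concatenate([w1, w2])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return np.sqrt(res.fun)
```

The mixture-level distance is thus obtained by solving a small discrete optimal transport problem, which is exactly a linear program.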
arXiv Detail & Related papers (2021-11-28T21:44:58Z) - Nonlinear Independent Component Analysis for Continuous-Time Signals [85.59763606620938]
We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
arXiv Detail & Related papers (2021-02-04T20:28:44Z) - Alternating linear scheme in a Bayesian framework for low-rank tensor
approximation [5.833272638548154]
We find a low-rank representation for a given tensor by solving a Bayesian inference problem.
We present an algorithm that performs the unscented transform in tensor train format.
arXiv Detail & Related papers (2020-12-21T10:15:30Z) - Low-Rank and Sparse Enhanced Tucker Decomposition for Tensor Completion [3.498620439731324]
We introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion.
Our model possesses a sparse regularization term to promote a sparse core tensor, which is beneficial for tensor data compression.
Notably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties that appear in tensors.
arXiv Detail & Related papers (2020-10-01T12:45:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.