On the properties of Gaussian Copula Mixture Models
- URL: http://arxiv.org/abs/2305.01479v2
- Date: Wed, 24 May 2023 01:41:38 GMT
- Title: On the properties of Gaussian Copula Mixture Models
- Authors: Ke Wan, Alain Kornhauser
- Abstract summary: The paper presents the mathematical definition of GCMM and explores the properties of its likelihood function.
The paper proposes extended Expectation-Maximization algorithms to estimate parameters for the mixture of copulas.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This paper investigates Gaussian copula mixture models (GCMM), which are an
extension of Gaussian mixture models (GMM) that incorporate copula concepts.
The paper presents the mathematical definition of GCMM and explores the
properties of its likelihood function. Additionally, the paper proposes
extended Expectation-Maximization algorithms to estimate parameters for the mixture
of copulas. The marginal distributions corresponding to each component are
estimated separately using nonparametric statistical methods. In the
experiments, GCMM demonstrates improved goodness of fit compared to GMM when
using the same number of clusters. Furthermore, GCMM can leverage
unsynchronized data across dimensions for more comprehensive data
analysis.
Related papers
- Adaptive Fuzzy C-Means with Graph Embedding [84.47075244116782]
Fuzzy clustering algorithms can be roughly categorized into two main groups: Fuzzy C-Means (FCM) based methods and mixture model based methods.
We propose a novel FCM-based clustering model that is capable of automatically learning an appropriate membership degree hyper-parameter value.
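For background on the first group, a minimal plain Fuzzy C-Means sketch follows; it uses a fixed fuzzifier m and does not reproduce the paper's contribution (automatic learning of the membership degree hyper-parameter or the graph embedding).

```python
# Plain Fuzzy C-Means with a fixed fuzzifier m (illustrative only).
import numpy as np

def fcm(X, c=3, m=2.0, n_iter=100, seed=0, eps=1e-10):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=X.shape[0])        # memberships, rows sum to 1
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]      # fuzzily weighted centroids
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
        U = 1.0 / (d ** (2.0 / (m - 1.0)))                # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U
```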
arXiv Detail & Related papers (2024-05-22T08:15:50Z)
- Finite Mixtures of Multivariate Poisson-Log Normal Factor Analyzers for Clustering Count Data [0.8499685241219366]
A class of eight parsimonious mixture models based on the mixtures of factor analyzers model are introduced.
The proposed models are explored in the context of clustering discrete data arising from RNA sequencing studies.
arXiv Detail & Related papers (2023-11-13T21:23:15Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- A Robust and Flexible EM Algorithm for Mixtures of Elliptical Distributions with Missing Data [71.9573352891936]
This paper tackles the problem of missing data imputation for noisy and non-Gaussian data.
A new EM algorithm is investigated for mixtures of elliptical distributions with the property of handling potential missing data.
Experimental results on synthetic data demonstrate that the proposed algorithm is robust to outliers and can be used with non-Gaussian data.
arXiv Detail & Related papers (2022-01-28T10:01:37Z)
- Image Modeling with Deep Convolutional Gaussian Mixture Models [79.0660895390689]
We present a new formulation of deep hierarchical Gaussian Mixture Models (GMMs) that is suitable for describing and generating images.
Deep convolutional GMMs (DCGMMs) avoid the limitations of flat GMMs through a stacked architecture of multiple GMM layers, linked by convolution and pooling operations.
For generating sharp images with DCGMMs, we introduce a new gradient-based technique for sampling through non-invertible operations like convolution and pooling.
Based on the MNIST and FashionMNIST datasets, we validate the DCGMM model by demonstrating its superiority over flat GMMs for clustering, sampling and outlier detection.
arXiv Detail & Related papers (2021-04-19T12:08:53Z)
- EGMM: an Evidential Version of the Gaussian Mixture Model for Clustering [22.586481334904793]
We propose a new model-based clustering algorithm, called EGMM (evidential GMM), in the theoretical framework of belief functions.
The parameters in EGMM are estimated by a specially designed Expectation-Maximization (EM) algorithm.
The proposed EGMM is as simple as the classical GMM, but can generate a more informative evidential partition for the considered dataset.
arXiv Detail & Related papers (2020-10-03T11:59:07Z)
- Kernel learning approaches for summarising and combining posterior similarity matrices [68.8204255655161]
We build upon the notion of the posterior similarity matrix (PSM) in order to suggest new approaches for summarising the output of MCMC algorithms for Bayesian clustering models.
A key contribution of our work is the observation that PSMs are positive semi-definite, and hence can be used to define probabilistically-motivated kernel matrices.
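As context for this observation (a hedged sketch, not the paper's kernel-learning method): given cluster labels saved from an MCMC run, PSM[i, j] is the fraction of draws in which observations i and j are co-clustered; because it is an average of co-clustering indicator matrices, each of which is positive semi-definite, the PSM itself is positive semi-definite and can be used directly as a kernel matrix. The function name below is illustrative.

```python
# Posterior similarity matrix from MCMC cluster assignments.
import numpy as np

def posterior_similarity(labels):
    """labels: array of shape (n_draws, n_obs) of cluster assignments."""
    draws, n = labels.shape
    psm = np.zeros((n, n))
    for z in labels:
        psm += (z[:, None] == z[None, :])      # 1 where i and j share a cluster
    return psm / draws

# toy check: the smallest eigenvalue is (numerically) non-negative, i.e. PSD
labels = np.random.default_rng(1).integers(0, 3, size=(500, 40))
K = posterior_similarity(labels)
print(np.linalg.eigvalsh(K).min() >= -1e-10)
```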
arXiv Detail & Related papers (2020-09-27T14:16:14Z)
- Clustering of non-Gaussian data by variational Bayes for normal inverse Gaussian mixture models [0.0]
In practical situations, there are many non-Gaussian data that are heavy-tailed and/or asymmetric.
For NIG mixture models, both expectation-maximization (EM) and variational Bayesian (VB) algorithms have been proposed.
We propose another VB algorithm for NIG mixtures that addresses the shortcomings of the existing one.
We also propose an extension of Dirichlet process mixture models to overcome the difficulty in determining the number of clusters.
arXiv Detail & Related papers (2020-09-13T14:13:27Z)
- Consistent Estimation of Identifiable Nonparametric Mixture Models from Grouped Observations [84.81435917024983]
This work proposes an algorithm that consistently estimates any identifiable mixture model from grouped observations.
A practical implementation is provided for paired observations, and the approach is shown to outperform existing methods.
arXiv Detail & Related papers (2020-06-12T20:44:22Z)
- Handling missing data in model-based clustering [0.0]
We propose two methods to fit Gaussian mixtures in the presence of missing data.
Both methods use a variant of the Monte Carlo Expectation-Maximisation algorithm for data augmentation.
We show that the proposed methods outperform the multiple imputation approach, both in terms of clusters identification and density estimation.
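For intuition about the data-augmentation step such methods rely on (a hedged sketch under Gaussian assumptions, not the authors' algorithm): within a given mixture component, missing coordinates can be drawn from their Gaussian conditional distribution given the observed coordinates. The function name is illustrative.

```python
# Draw missing entries from the conditional Gaussian given observed entries.
import numpy as np

def draw_missing(x, mu, Sigma, rng):
    """x: 1-D array with np.nan at missing positions; returns a completed copy."""
    miss = np.isnan(x)
    obs = ~miss
    if not miss.any():
        return x.copy()
    S_oo = Sigma[np.ix_(obs, obs)]
    S_mo = Sigma[np.ix_(miss, obs)]
    S_mm = Sigma[np.ix_(miss, miss)]
    A = S_mo @ np.linalg.inv(S_oo)
    cond_mu = mu[miss] + A @ (x[obs] - mu[obs])    # conditional mean of missing | observed
    cond_cov = S_mm - A @ S_mo.T                   # conditional covariance
    out = x.copy()
    out[miss] = rng.multivariate_normal(cond_mu, cond_cov)
    return out
```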
arXiv Detail & Related papers (2020-06-04T15:36:31Z)
- Projection pursuit based on Gaussian mixtures and evolutionary algorithms [0.0]
We propose a projection pursuit (PP) algorithm based on Gaussian mixture models (GMMs).
We show that this semi-parametric approach to PP is flexible and allows highly informative structures to be detected.
The performance of the proposed approach is shown on both artificial and real datasets.
arXiv Detail & Related papers (2019-12-27T10:25:41Z)