Unsupervised clustering of series using dynamic programming and neural
processes
- URL: http://arxiv.org/abs/2101.10983v1
- Date: Tue, 26 Jan 2021 18:17:10 GMT
- Title: Unsupervised clustering of series using dynamic programming and neural
processes
- Authors: Karthigan Sinnathamby, Chang-Yu Hou, Lalitha Venkataramanan,
Vasileios-Marios Gkortsas, Fran\c{c}ois Fleuret
- Abstract summary: We would like to segment and cluster a series such that the resulting blocks present in each cluster are coherent with respect to a predefined model structure.
It is useful to establish a general framework that enables the integration of plausible models and also accommodates data-driven approaches into one approximated model to assist the clustering task.
In this work, we investigate the use of neural processes to build the approximated model while yielding the same assumptions required by the algorithm presented in arXiv:2101.09512.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Following the work of arXiv:2101.09512, we are interested in clustering a
given multi-variate series in an unsupervised manner. We would like to segment
and cluster the series such that the resulting blocks present in each cluster
are coherent with respect to a predefined model structure (e.g. a physics model
with a functional form defined by a number of parameters). However, such an
approach has its limitations, partly because there may exist multiple models
that describe the same data, and partly because the exact model behind the
data may not be immediately known. Hence, it is useful to establish a general
framework that enables the integration of plausible models and also
accommodates data-driven approaches into one approximated model to assist the
clustering task. In this work, we therefore investigate the use of neural
processes to build the approximated model while yielding the same assumptions
required by the algorithm presented in arXiv:2101.09512.
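As a toy illustration of the coherence notion in the abstract, the sketch below scores a block by how well a single parameter setting of a predefined model fits all of its points. A simple linear model y = a*t + b stands in for the paper's model structure (which may be a physics model or a neural-process surrogate); the function names are illustrative, not from the paper.

```python
# Minimal sketch: a block is "coherent" with a model structure if one
# parameter setting fits every point well. Here the model is y = a*t + b.

def fit_linear(block):
    """Least-squares fit of y = a*t + b over t = 0..len(block)-1."""
    n = len(block)
    t_mean = (n - 1) / 2
    y_mean = sum(block) / n
    denom = sum((t - t_mean) ** 2 for t in range(n))
    a = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(block)) / denom
    b = y_mean - a * t_mean
    return a, b

def coherence_cost(block):
    """Residual sum of squares under the fitted linear model:
    low cost means the block is coherent with the model structure."""
    a, b = fit_linear(block)
    return sum((y - (a * t + b)) ** 2 for t, y in enumerate(block))

coherent = [0.0, 1.0, 2.0, 3.0]     # exactly linear: near-zero cost
incoherent = [0.0, 3.0, -1.0, 4.0]  # no single (a, b) fits well
```

In the paper's setting, this per-block cost would come from the predefined (or neural-process-approximated) model rather than a linear fit.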
Related papers
- Embedding-based statistical inference on generative models [10.948308354932639]
We extend results related to embedding-based representations of generative models to classical statistical inference settings.
We demonstrate that using the perspective space as the basis of a notion of "similar" is effective for multiple model-level inference tasks.
arXiv Detail & Related papers (2024-10-01T22:28:39Z)
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
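One simple way to integrate the outcomes of multiple predictive models is to sample from their weighted mixture. The sketch below is a generic Monte Carlo fusion of Gaussian predictive distributions; the means, standard deviations, and equal weights are made up for illustration, and the paper's actual fusion rule may differ.

```python
# Hedged sketch: fuse several models' Gaussian predictions by Monte Carlo
# sampling from their weighted mixture sum_k w_k * N(mean_k, std_k^2).
import random

def fuse_predictions(means, stds, weights, n_samples=10_000, seed=0):
    """Draw samples from the weighted mixture of Gaussian predictions."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        # pick a model according to its weight, then sample its prediction
        k = rng.choices(range(len(means)), weights=weights)[0]
        samples.append(rng.gauss(means[k], stds[k]))
    return samples

# two hypothetical models that disagree about the predicted value
samples = fuse_predictions(means=[1.0, 3.0], stds=[0.1, 0.1],
                           weights=[0.5, 0.5])
fused_mean = sum(samples) / len(samples)
```

The fused sample reflects both models' uncertainty, not just a point average of their means.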
arXiv Detail & Related papers (2024-03-03T04:21:21Z)
- A parallelizable model-based approach for marginal and multivariate clustering [0.0]
This paper develops a clustering method that takes advantage of the sturdiness of model-based clustering.
We tackle this issue by specifying a finite mixture model per margin that allows each margin to have a different number of clusters.
The proposed approach is computationally appealing as well as more tractable for moderate to high dimensions than a 'full' (joint) model-based clustering approach.
arXiv Detail & Related papers (2022-12-07T23:54:41Z)
- Unified Multi-View Orthonormal Non-Negative Graph Based Clustering Framework [74.25493157757943]
We formulate a novel clustering model, which exploits the non-negative feature property and incorporates the multi-view information into a unified joint learning framework.
We also explore, for the first time, the multi-model non-negative graph-based approach to clustering data based on deep features.
arXiv Detail & Related papers (2022-11-03T08:18:27Z)
- Time Series Clustering with an EM algorithm for Mixtures of Linear Gaussian State Space Models [0.0]
We propose a novel model-based time series clustering method with mixtures of linear Gaussian state space models.
The proposed method uses a new expectation-maximization algorithm for the mixture model to estimate the model parameters.
Experiments on a simulated dataset demonstrate the effectiveness of the method in clustering, parameter estimation, and model selection.
arXiv Detail & Related papers (2022-08-25T07:41:23Z)
- Low-Rank Constraints for Fast Inference in Structured Models [110.38427965904266]
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
Experiments with neural parameterized structured models for language modeling, polyphonic music modeling, unsupervised grammar induction, and video modeling show that our approach matches the accuracy of standard models at large state spaces.
arXiv Detail & Related papers (2022-01-08T00:47:50Z)
- An Ample Approach to Data and Modeling [1.0152838128195467]
We describe a framework for modeling how models can be built that integrates concepts and methods from a wide range of fields.
The reference M* meta model framework is presented, which relies critically on associating whole datasets and respective models in terms of a strict equivalence relation.
Several considerations about how the developed framework can provide insights about data clustering, complexity, collaborative research, deep learning, and creativity are then presented.
arXiv Detail & Related papers (2021-10-05T01:26:09Z)
- Model-agnostic multi-objective approach for the evolutionary discovery of mathematical models [55.41644538483948]
In modern data science, it is often more important to understand the properties of a model and which of its parts could be replaced to obtain better results.
We use multi-objective evolutionary optimization for composite data-driven model learning to obtain the algorithm's desired properties.
arXiv Detail & Related papers (2021-07-07T11:17:09Z)
- Unsupervised clustering of series using dynamic programming [0.0]
We would like to segment and cluster the series such that the resulting blocks present in each cluster are coherent with respect to a known model.
Data points are said to be coherent if they can be described using this model with the same parameters.
We have designed an algorithm based on dynamic programming with constraints on the number of clusters, the number of transitions as well as the minimal size of a block.
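The constraints described above can be sketched with a standard dynamic-programming segmentation. The block cost below (squared error around the block mean) is only a stand-in for coherence with respect to the paper's known model; the constraint on the number of clusters is omitted for brevity, and the number of blocks bounds the number of transitions.

```python
# Hedged sketch: DP segmentation of a 1-D series into at most `max_blocks`
# contiguous blocks, each at least `min_size` points long. A block is scored
# by squared error around its mean -- a toy stand-in for model coherence.

def block_cost(series, i, j):
    """Sum of squared deviations from the block mean for series[i:j]."""
    block = series[i:j]
    mean = sum(block) / len(block)
    return sum((x - mean) ** 2 for x in block)

def segment(series, max_blocks, min_size):
    """Return (total_cost, cut_positions) of the best segmentation."""
    n = len(series)
    INF = float("inf")
    # dp[k][j]: best cost of splitting series[:j] into exactly k blocks
    dp = [[INF] * (n + 1) for _ in range(max_blocks + 1)]
    back = [[None] * (n + 1) for _ in range(max_blocks + 1)]
    dp[0][0] = 0.0
    for k in range(1, max_blocks + 1):
        for j in range(k * min_size, n + 1):
            for i in range((k - 1) * min_size, j - min_size + 1):
                if dp[k - 1][i] == INF:
                    continue
                c = dp[k - 1][i] + block_cost(series, i, j)
                if c < dp[k][j]:
                    dp[k][j] = c
                    back[k][j] = i
    # pick the best number of blocks <= max_blocks
    best_k = min(range(1, max_blocks + 1), key=lambda k: dp[k][n])
    cuts, j = [], n
    for k in range(best_k, 0, -1):
        i = back[k][j]
        cuts.append(i)
        j = i
    return dp[best_k][n], sorted(cuts)[1:]  # drop the leading 0

# three regimes: near 0, near 5, near 0 again
series = [0.0, 0.1, -0.1, 5.0, 5.2, 4.9, 0.05, -0.05, 0.0]
cost, cuts = segment(series, max_blocks=3, min_size=2)
```

The recovered cut positions [3, 6] separate the three regimes; clustering the resulting blocks by parameter similarity would be the second stage of the paper's algorithm.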
arXiv Detail & Related papers (2021-01-23T14:35:35Z)
- Kernel learning approaches for summarising and combining posterior similarity matrices [68.8204255655161]
We build upon the notion of the posterior similarity matrix (PSM) in order to suggest new approaches for summarising the output of MCMC algorithms for Bayesian clustering models.
A key contribution of our work is the observation that PSMs are positive semi-definite, and hence can be used to define probabilistically-motivated kernel matrices.
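A posterior similarity matrix is straightforward to build from MCMC output: entry (i, j) is the fraction of posterior draws in which items i and j fall in the same cluster. The sketch below uses made-up label draws; function and variable names are illustrative, not from the paper.

```python
# Hedged sketch: posterior similarity matrix (PSM) from MCMC cluster-label
# samples. Each draw contributes an indicator matrix Z Z^T (Z = one-hot
# cluster memberships), which is PSD, so their average -- the PSM -- is PSD
# and can serve as a probabilistically-motivated kernel matrix.

def posterior_similarity(label_draws):
    """label_draws: list of label lists, one per MCMC draw."""
    n = len(label_draws[0])
    m = len(label_draws)
    counts = [[0] * n for _ in range(n)]
    for labels in label_draws:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    counts[i][j] += 1
    # divide once at the end to avoid accumulating rounding error
    return [[c / m for c in row] for row in counts]

# three hypothetical posterior draws of cluster labels for four items
draws = [
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
psm = posterior_similarity(draws)
```

By construction the PSM is symmetric with unit diagonal, and its entries estimate posterior co-clustering probabilities.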
arXiv Detail & Related papers (2020-09-27T14:16:14Z)
- Bayesian Sparse Factor Analysis with Kernelized Observations [67.60224656603823]
Multi-view problems can be faced with latent variable models.
High-dimensionality and non-linear issues are traditionally handled by kernel methods.
We propose merging both approaches into a single model.
arXiv Detail & Related papers (2020-06-01T14:25:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.