MHCCL: Masked Hierarchical Cluster-Wise Contrastive Learning for
Multivariate Time Series
- URL: http://arxiv.org/abs/2212.01141v4
- Date: Thu, 30 Mar 2023 07:11:56 GMT
- Title: MHCCL: Masked Hierarchical Cluster-Wise Contrastive Learning for
Multivariate Time Series
- Authors: Qianwen Meng, Hangwei Qian, Yong Liu, Lizhen Cui, Yonghui Xu, Zhiqi
Shen
- Abstract summary: The Masked Hierarchical Cluster-wise Contrastive Learning (MHCCL) model is presented.
It exploits semantic information obtained from a hierarchical structure consisting of multiple latent partitions of multivariate time series.
It is shown to outperform state-of-the-art approaches to unsupervised time series representation learning.
- Score: 20.008535430484475
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning semantic-rich representations from raw unlabeled time series data is
critical for downstream tasks such as classification and forecasting.
Contrastive learning has recently shown its promising representation learning
capability in the absence of expert annotations. However, existing contrastive
approaches generally treat each instance independently, which leads to false
negative pairs that share the same semantics. To tackle this problem, we
propose MHCCL, a Masked Hierarchical Cluster-wise Contrastive Learning model,
which exploits semantic information obtained from the hierarchical structure
consisting of multiple latent partitions for multivariate time series.
Motivated by the observation that fine-grained clustering preserves higher
purity while coarse-grained clustering reflects higher-level semantics, we propose a
novel downward masking strategy to filter out fake negatives and supplement
positives by incorporating the multi-granularity information from the
clustering hierarchy. In addition, a novel upward masking strategy is designed
in MHCCL to remove outliers of clusters at each partition to refine prototypes,
which helps speed up the hierarchical clustering process and improves the
clustering quality. We conduct experimental evaluations on seven widely-used
multivariate time series datasets. The results demonstrate the superiority of
MHCCL over the state-of-the-art approaches for unsupervised time series
representation learning.
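The two masking strategies described in the abstract can be illustrated with a minimal NumPy sketch. This is an illustrative reading of the abstract only, not the authors' implementation: the function names, the `mask_ratio` parameter, and the toy labels are all assumptions.

```python
import numpy as np

def refine_prototypes(X, labels, mask_ratio=0.25):
    """Upward masking (illustrative): within each cluster, drop the
    mask_ratio fraction of points farthest from the cluster mean,
    then recompute the prototype from the remaining points."""
    prototypes = {}
    for c in np.unique(labels):
        pts = X[labels == c]
        center = pts.mean(axis=0)
        dist = np.linalg.norm(pts - center, axis=1)
        keep = dist <= np.quantile(dist, 1.0 - mask_ratio)
        prototypes[c] = pts[keep].mean(axis=0)
    return prototypes

def negative_mask(coarse_labels, anchor):
    """Downward masking (illustrative): treat an instance as a valid
    negative for the anchor only if it falls in a different cluster at
    the coarser partition, filtering out likely false negatives."""
    return coarse_labels != coarse_labels[anchor]

# Toy example: three tight points plus one outlier in a single fine cluster.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]])
fine_labels = np.array([0, 0, 0, 0])
protos = refine_prototypes(X, fine_labels, mask_ratio=0.25)

# A coarser partition that places the anchor's semantic neighbors with it.
coarse_labels = np.array([0, 0, 1, 1])
mask = negative_mask(coarse_labels, anchor=0)
```

In this sketch, the outlier `[5, 5]` is masked before the prototype is computed, and instances sharing the anchor's coarse cluster are excluded from the negative set, mirroring the paper's stated motivation for each strategy.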
Related papers
- Dynamic Contrastive Learning for Time Series Representation [6.086030037869592]
We propose DynaCL, an unsupervised contrastive representation learning framework for time series.
We demonstrate that DynaCL embeds instances from time series into semantically meaningful clusters.
Our findings also reveal that high scores on unsupervised clustering metrics do not guarantee that the representations are useful in downstream tasks.
arXiv Detail & Related papers (2024-10-20T15:20:24Z)
- Dual-disentangled Deep Multiple Clustering [4.040415173465543]
We propose a novel Dual-Disentangled deep Multiple Clustering method named DDMC by learning disentangled representations.
Our experiments demonstrate that DDMC consistently outperforms state-of-the-art methods across seven commonly used tasks.
arXiv Detail & Related papers (2024-02-07T23:05:30Z)
- Towards Generalized Multi-stage Clustering: Multi-view Self-distillation [10.368796552760571]
Existing multi-stage clustering methods independently learn the salient features from multiple views and then perform the clustering task.
This paper proposes a novel multi-stage deep MVC framework where multi-view self-distillation (DistilMVC) is introduced to distill dark knowledge of label distribution.
arXiv Detail & Related papers (2023-10-29T03:35:34Z)
- Instance-Optimal Cluster Recovery in the Labeled Stochastic Block Model [79.46465138631592]
We devise an efficient algorithm that recovers clusters using the observed labels.
We present Instance-Adaptive Clustering (IAC), the first algorithm whose performance matches these lower bounds both in expectation and with high probability.
arXiv Detail & Related papers (2023-06-18T08:46:06Z)
- Deep Multiview Clustering by Contrasting Cluster Assignments [14.767319805995543]
Multiview clustering aims to reveal the underlying structure of multiview data by categorizing data samples into clusters.
We propose a cross-view contrastive learning method that learns view-invariant representations and produces clustering results by contrasting the cluster assignments among multiple views.
arXiv Detail & Related papers (2023-04-21T06:35:54Z)
- Deep Temporal Contrastive Clustering [21.660509622172274]
This paper presents a deep temporal contrastive clustering approach.
It incorporates the contrastive learning paradigm into the deep time series clustering research.
Experiments on a variety of time series datasets demonstrate the superiority of our approach over the state-of-the-art.
arXiv Detail & Related papers (2022-12-29T16:43:34Z)
- Unified Multi-View Orthonormal Non-Negative Graph Based Clustering Framework [74.25493157757943]
We formulate a novel clustering model, which exploits the non-negative feature property and incorporates the multi-view information into a unified joint learning framework.
We also explore, for the first time, the multi-model non-negative graph-based approach to clustering data based on deep features.
arXiv Detail & Related papers (2022-11-03T08:18:27Z)
- DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z)
- You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data assigned to the same cluster contribute to a unified representation.
We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, TCC is trained end-to-end, requiring no alternating steps.
arXiv Detail & Related papers (2021-06-03T14:59:59Z)
- Graph Contrastive Clustering [131.67881457114316]
We propose a novel graph contrastive learning framework, which is then applied to the clustering task, yielding the Graph Contrastive Clustering (GCC) method.
Specifically, on the one hand, the graph Laplacian based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
arXiv Detail & Related papers (2021-04-03T15:32:49Z)
- Unsupervised Multi-view Clustering by Squeezing Hybrid Knowledge from Cross View and Each View [68.88732535086338]
This paper proposes a new multi-view clustering method, low-rank subspace multi-view clustering based on adaptive graph regularization.
Experimental results for five widely used multi-view benchmarks show that our proposed algorithm surpasses other state-of-the-art methods by a clear margin.
arXiv Detail & Related papers (2020-08-23T08:25:06Z)
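Several entries above rely on graph Laplacian machinery (the graph Laplacian based contrastive loss in GCC, and adaptive graph regularization in the multi-view subspace method). As a reference point, here is a minimal sketch of the symmetric normalized Laplacian built from an affinity matrix; it is illustrative only and is not either paper's actual construction.

```python
import numpy as np

def normalized_laplacian(W):
    """Symmetric normalized graph Laplacian: L = I - D^{-1/2} W D^{-1/2},
    where W is a symmetric non-negative affinity matrix and D is its
    degree (row-sum) diagonal. Eigenvalues of L lie in [0, 2]."""
    d = W.sum(axis=1)
    # Guard against isolated nodes (zero degree) with a small floor.
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

# Two nodes joined by a unit-weight edge: L = [[1, -1], [-1, 1]].
W = np.array([[0.0, 1.0], [1.0, 0.0]])
L = normalized_laplacian(W)
```

Losses of this family typically penalize a quadratic form such as trace(Fᵀ L F) over learned features F, which encourages strongly connected samples to receive similar representations.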
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.