Deep Adversarial Inconsistent Cognitive Sampling for Multi-view
Progressive Subspace Clustering
- URL: http://arxiv.org/abs/2101.03783v2
- Date: Wed, 13 Jan 2021 04:55:26 GMT
- Title: Deep Adversarial Inconsistent Cognitive Sampling for Multi-view
Progressive Subspace Clustering
- Authors: Renhao Sun, Yang Wang, Zhao Zhang, Richang Hong, and Meng Wang
- Abstract summary: We propose a novel Deep Adversarial Inconsistent Cognitive Sampling (DAICS) method for multi-view progressive subspace clustering.
We develop a multi-view cognitive sampling strategy to select the input samples from easy to difficult for multi-view clustering network training.
Experimental results on four real-world datasets demonstrate the superiority of DAICS over the state-of-the-art methods.
- Score: 45.8773004047657
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep multi-view clustering methods have achieved remarkable performance.
However, none of them considers the difficulty labels (the uncertainty of the
ground truth for training samples) of multi-view samples, which may cause the
clustering network to get stuck in poor local optima during training; worse
still, the difficulty labels of multi-view samples are often inconsistent
across views, which makes them even more challenging to handle. In this paper,
we propose a novel Deep Adversarial Inconsistent Cognitive Sampling (DAICS)
method for multi-view progressive subspace clustering. A multi-view binary
classification (easy or difficult) loss and a feature similarity loss are
proposed to jointly learn a binary classifier and a deep consistent feature
embedding network through an adversarial minimax game over the difficulty
labels of multi-view consistent samples. We develop a multi-view cognitive
sampling strategy that selects input samples from easy to difficult for
training the multi-view clustering network. However, the distributions of easy
and difficult samples are mixed together, so this goal is not trivial to
achieve. To resolve this, we define a sampling probability with a theoretical
guarantee. Based on it, a golden section mechanism is further designed to
generate a sample-set boundary that progressively selects samples with varied
difficulty labels via a gate unit, which is used to jointly learn a multi-view
common progressive subspace and a clustering network for more efficient
clustering. Experimental results on four real-world datasets demonstrate the
superiority of DAICS over the state-of-the-art methods.
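The easy-to-hard selection with a golden-section boundary can be sketched as follows. This is a minimal illustration under assumed details: the function name, the use of per-sample scores as difficulty labels, and the linear boundary-expansion schedule are all hypothetical, not the authors' exact formulation.

```python
import numpy as np

# Golden-section ratio ((sqrt(5) - 1) / 2 ~= 0.618), used here as the
# initial fraction of the (sorted) sample pool admitted for training.
GOLDEN = (np.sqrt(5) - 1) / 2

def select_progressive_subset(difficulty_scores, epoch, max_epochs):
    """Sort samples from easy to difficult and return the indices of the
    subset admitted at this epoch. The admission boundary starts at the
    golden-section point of the sorted pool and expands toward the full
    set as training progresses (hypothetical linear schedule)."""
    order = np.argsort(difficulty_scores)  # easy -> difficult
    n = len(order)
    start = int(GOLDEN * n)                # initial golden-section boundary
    # Expand the boundary linearly from the golden-section point to n,
    # acting as a simple gate that admits harder samples over time.
    boundary = start + int((n - start) * min(1.0, epoch / max_epochs))
    return order[:max(1, boundary)]
```

In an actual training loop, the difficulty scores would come from the learned binary classifier described in the abstract; here any per-sample score (e.g. a loss value) stands in for them.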
Related papers
- CDIMC-net: Cognitive Deep Incomplete Multi-view Clustering Network [53.72046586512026]
We propose a novel incomplete multi-view clustering network, called Cognitive Deep Incomplete Multi-view Clustering Network (CDIMC-net).
It captures the high-level features and local structure of each view by incorporating view-specific deep encoders and a graph embedding strategy into one framework.
Based on human cognition, i.e., learning from easy to hard, it introduces a self-paced strategy to select the most confident samples for model training.
arXiv Detail & Related papers (2024-03-28T15:45:03Z) - Virtual Category Learning: A Semi-Supervised Learning Method for Dense
Prediction with Extremely Limited Labels [63.16824565919966]
This paper proposes to use confusing samples proactively without label correction.
A Virtual Category (VC) is assigned to each confusing sample in such a way that it can safely contribute to the model optimisation.
Our intriguing findings highlight the usage of VC learning in dense vision tasks.
arXiv Detail & Related papers (2023-12-02T16:23:52Z) - Towards Generalized Multi-stage Clustering: Multi-view Self-distillation [10.368796552760571]
Existing multi-stage clustering methods independently learn the salient features from multiple views and then perform the clustering task.
This paper proposes a novel multi-stage deep MVC framework where multi-view self-distillation (DistilMVC) is introduced to distill dark knowledge of label distribution.
arXiv Detail & Related papers (2023-10-29T03:35:34Z) - Tackling Diverse Minorities in Imbalanced Classification [80.78227787608714]
Imbalanced datasets are commonly observed in various real-world applications, presenting significant challenges in training classifiers.
We propose generating synthetic samples iteratively by mixing data samples from both minority and majority classes.
We demonstrate the effectiveness of our proposed framework through extensive experiments conducted on seven publicly available benchmark datasets.
arXiv Detail & Related papers (2023-08-28T18:48:34Z) - Hard Regularization to Prevent Deep Online Clustering Collapse without
Data Augmentation [65.268245109828]
Online deep clustering refers to the joint use of a feature extraction network and a clustering model to assign cluster labels to each new data point or batch as it is processed.
While faster and more versatile than offline methods, online clustering can easily reach a collapsed solution where the encoder maps all inputs to the same point and all samples are put into a single cluster.
We propose a method that requires no data augmentation and, unlike existing methods, regularizes the hard assignments.
arXiv Detail & Related papers (2023-03-29T08:23:26Z) - Deep Multi-View Semi-Supervised Clustering with Sample Pairwise
Constraints [10.226754903113164]
We propose a novel Deep Multi-view Semi-supervised Clustering (DMSC) method, which jointly optimizes three kinds of losses during network fine-tuning.
We demonstrate that our proposed approach performs better than state-of-the-art multi-view and single-view competitors.
arXiv Detail & Related papers (2022-06-10T08:51:56Z) - Learning Statistical Representation with Joint Deep Embedded Clustering [2.1267423178232407]
StatDEC is an unsupervised framework for joint statistical representation learning and clustering.
Our experiments show that using these representations, one can considerably improve results on imbalanced image clustering across a variety of image datasets.
arXiv Detail & Related papers (2021-09-11T09:26:52Z) - Rank-Consistency Deep Hashing for Scalable Multi-Label Image Search [90.30623718137244]
We propose a novel deep hashing method for scalable multi-label image search.
A new rank-consistency objective is applied to align the similarity orders from two spaces.
A powerful loss function is designed to penalize samples whose semantic similarity and Hamming distance are mismatched.
arXiv Detail & Related papers (2021-02-02T13:46:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.