Effect of The Latent Structure on Clustering with GANs
- URL: http://arxiv.org/abs/2005.02435v1
- Date: Tue, 5 May 2020 18:52:49 GMT
- Title: Effect of The Latent Structure on Clustering with GANs
- Authors: Deepak Mishra, Aravind Jayendran, Prathosh A. P
- Abstract summary: We focus on the problem of clustering in generated space of GANs and uncover its relationship with the characteristics of the latent space.
We derive, from first principles, the necessary and sufficient conditions needed to achieve faithful clustering in the GAN framework.
We also describe a procedure to construct a multimodal latent space which facilitates learning of cluster priors with sparse supervision.
- Score: 13.970914037707724
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative adversarial networks (GANs) have shown remarkable success in
generation of data from natural data manifolds such as images. In several
scenarios, it is desirable that generated data is well-clustered, especially
when there is severe class imbalance. In this paper, we focus on the problem of
clustering in generated space of GANs and uncover its relationship with the
characteristics of the latent space. We derive, from first principles, the
necessary and sufficient conditions needed to achieve faithful clustering in
the GAN framework: (i) presence of a multimodal latent space with adjustable
priors, (ii) existence of a latent space inversion mechanism and (iii)
imposition of the desired cluster priors on the latent space. We also identify
the GAN models in the literature that partially satisfy these conditions and
demonstrate the importance of all the required components through ablative
studies on multiple real-world image datasets. Additionally, we describe a
procedure to construct a multimodal latent space which facilitates learning of
cluster priors with sparse supervision.
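Condition (i) above, a multimodal latent space with adjustable priors, can be illustrated with a short sketch. The snippet below is not the authors' implementation; it simply samples latent codes from a Gaussian mixture with one mode per cluster, where the mixture weights play the role of the adjustable cluster priors (the mode spacing, noise scale, and one-hot mode placement are illustrative assumptions):

```python
import numpy as np

def sample_multimodal_latent(n_samples, n_clusters, latent_dim,
                             cluster_priors, mode_spacing=5.0, sigma=1.0,
                             rng=None):
    """Draw latent codes from a Gaussian mixture with one mode per cluster.

    cluster_priors: probability of each cluster (the adjustable prior).
    Mode means are placed on scaled one-hot axes for separability.
    """
    rng = np.random.default_rng(rng)
    # Choose a mixture component for each sample according to the priors.
    components = rng.choice(n_clusters, size=n_samples, p=cluster_priors)
    # One mode per cluster: a scaled one-hot direction in latent space.
    means = mode_spacing * np.eye(n_clusters, latent_dim)
    z = means[components] + sigma * rng.standard_normal((n_samples, latent_dim))
    return z, components

# Imbalanced priors, mirroring the class-imbalance setting in the abstract.
priors = np.array([0.7, 0.2, 0.1])
z, labels = sample_multimodal_latent(1000, 3, 16, priors, rng=0)
```

Feeding `z` to a generator in place of the usual unimodal Gaussian gives each cluster its own latent mode, and adjusting `cluster_priors` changes the expected proportion of samples generated per cluster.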
Related papers
- Controllable diffusion-based generation for multi-channel biological data [66.44042377817074]
This work proposes a unified diffusion framework for controllable generation over structured and spatial biological data.
We show state-of-the-art performance across both spatial and non-spatial prediction tasks, including protein imputation in IMC and gene-to-protein prediction in single-cell datasets.
arXiv Detail & Related papers (2025-06-24T00:56:21Z)
- Scalable Context-Preserving Model-Aware Deep Clustering for Hyperspectral Images [51.95768218975529]
Subspace clustering has become widely adopted for the unsupervised analysis of hyperspectral images (HSIs).
Recent model-aware deep subspace clustering methods often use a two-stage framework, involving the calculation of a self-representation matrix with complexity of O(n^2), followed by spectral clustering.
We propose a scalable, context-preserving deep clustering method based on basis representation, which jointly captures local and non-local structures for efficient HSI clustering.
arXiv Detail & Related papers (2025-06-12T16:43:09Z)
- GCC: Generative Calibration Clustering [55.44944397168619]
We propose a novel Generative Calibration Clustering (GCC) method to incorporate feature learning and augmentation into the clustering procedure.
First, we develop a discriminative feature alignment mechanism to discover the intrinsic relationship across real and generated samples.
Second, we design a self-supervised metric learning scheme to generate more reliable cluster assignments.
arXiv Detail & Related papers (2024-04-14T01:51:11Z)
- Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein [56.62376364594194]
Unsupervised learning aims to capture the underlying structure of potentially large and high-dimensional datasets.
In this work, we revisit these approaches under the lens of optimal transport and exhibit relationships with the Gromov-Wasserstein problem.
This unveils a new general framework, called distributional reduction, that recovers DR and clustering as special cases and allows addressing them jointly within a single optimization problem.
arXiv Detail & Related papers (2024-02-03T19:00:19Z)
- Hyper-Laplacian Regularized Concept Factorization in Low-rank Tensor Space for Multi-view Clustering [0.0]
We propose a hyper-Laplacian regularized concept factorization (HLRCF) in low-rank tensor space for multi-view clustering.
Specifically, we adopt the concept factorization to explore the latent cluster-wise representation of each view.
Considering that different tensor singular values associate structural information with unequal importance, we develop a self-weighted tensor Schatten p-norm.
arXiv Detail & Related papers (2023-04-22T15:46:58Z)
- Multi-View Clustering via Semi-non-negative Tensor Factorization [120.87318230985653]
We develop a novel multi-view clustering method based on semi-non-negative tensor factorization (Semi-NTF).
Our model directly considers the between-view relationship and exploits the between-view complementary information.
In addition, we provide an optimization algorithm for the proposed method and prove mathematically that the algorithm always converges to the stationary KKT point.
arXiv Detail & Related papers (2023-03-29T14:54:19Z)
- Adaptively-weighted Integral Space for Fast Multiview Clustering [54.177846260063966]
We propose an Adaptively-weighted Integral Space for Fast Multiview Clustering (AIMC) with nearly linear complexity.
Specifically, view generation models are designed to reconstruct the view observations from the latent integral space.
Experiments conducted on several real-world datasets confirm the superiority of the proposed AIMC method.
arXiv Detail & Related papers (2022-08-25T05:47:39Z)
- Enhancing cluster analysis via topological manifold learning [0.3823356975862006]
We show that inferring the topological structure of a dataset before clustering can considerably enhance cluster detection.
We combine the manifold learning method UMAP, for inferring the topological structure, with the density-based clustering method DBSCAN.
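The two-step pipeline described here (manifold embedding, then density-based clustering of the embedding) can be sketched with scikit-learn. Isomap stands in for UMAP below purely to keep the example dependency-free; the umap-learn package provides a `UMAP` class with the same `fit_transform` interface. Dataset, `eps`, and `n_components` are illustrative choices, not values from the paper:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.manifold import Isomap   # stand-in for umap.UMAP
from sklearn.cluster import DBSCAN

# Well-separated synthetic clusters in 10-D.
X, _ = make_blobs(n_samples=300, centers=3, n_features=10,
                  cluster_std=0.5, random_state=0)

# Step 1: infer a low-dimensional structure of the data.
embedding = Isomap(n_components=2).fit_transform(X)

# Step 2: density-based clustering on the embedding.
labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(embedding)
n_clusters = len(set(labels) - {-1})   # -1 marks DBSCAN noise points
```

The point of embedding first is that DBSCAN's density estimates degrade in high dimensions, whereas distances in the learned low-dimensional space better reflect the data's topology.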
arXiv Detail & Related papers (2022-07-01T15:53:39Z)
- Deep Conditional Gaussian Mixture Model for Constrained Clustering [7.070883800886882]
Constrained clustering can leverage prior information on a growing amount of only partially labeled data.
We propose a novel framework for constrained clustering that is intuitive, interpretable, and can be trained efficiently in the framework of gradient variational inference.
arXiv Detail & Related papers (2021-06-11T13:38:09Z)
- Spatial-Spectral Clustering with Anchor Graph for Hyperspectral Image [88.60285937702304]
This paper proposes a novel unsupervised approach called spatial-spectral clustering with anchor graph (SSCAG) for HSI data clustering.
The proposed SSCAG is competitive against the state-of-the-art approaches.
arXiv Detail & Related papers (2021-04-24T08:09:27Z)
- Unveiling the Potential of Structure-Preserving for Weakly Supervised Object Localization [71.79436685992128]
We propose a two-stage approach, termed structure-preserving activation (SPA), towards fully leveraging the structure information incorporated in convolutional features for WSOL.
In the first stage, a restricted activation module (RAM) is designed to alleviate the structure-missing issue caused by the classification network.
In the second stage, we propose a post-process approach, termed self-correlation map generating (SCG) module to obtain structure-preserving localization maps.
arXiv Detail & Related papers (2021-03-08T03:04:14Z)
- A Critique of Self-Expressive Deep Subspace Clustering [23.971512395191308]
Subspace clustering is an unsupervised clustering technique designed to cluster data that is supported on a union of linear subspaces.
We show that there are a number of potential flaws with this approach which have not been adequately addressed in prior work.
arXiv Detail & Related papers (2020-10-08T00:14:59Z)
- Graph Convolutional Subspace Clustering: A Robust Subspace Clustering Framework for Hyperspectral Image [6.332208511335129]
We present a novel subspace clustering framework called Graph Convolutional Subspace Clustering (GCSC) for robust HSI clustering.
Specifically, the framework recasts the self-expressiveness property of the data into the non-Euclidean domain.
We show that traditional subspace clustering models are the special forms of our framework with the Euclidean data.
arXiv Detail & Related papers (2020-04-22T10:09:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.