Sparse Subspace Clustering in Diverse Multiplex Network Model
- URL: http://arxiv.org/abs/2206.07602v2
- Date: Tue, 25 Apr 2023 16:44:18 GMT
- Title: Sparse Subspace Clustering in Diverse Multiplex Network Model
- Authors: Majid Noroozi and Marianna Pensky
- Abstract summary: The paper considers the DIverse MultiPLEx (DIMPLE) network model, where all layers of the network have the same collection of nodes and are equipped with Stochastic Block Models.
The DIMPLE model generalizes a multitude of papers that study multilayer networks with the same community structures in all layers.
The present paper uses Sparse Subspace Clustering (SSC) for identifying groups of layers with identical community structures.
- Score: 4.56877715768796
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The paper considers the DIverse MultiPLEx (DIMPLE) network model, introduced
in Pensky and Wang (2021), where all layers of the network have the same
collection of nodes and are equipped with the Stochastic Block Models. In
addition, all layers can be partitioned into groups with the same community
structures, although the layers in the same group may have different matrices
of block connection probabilities. The DIMPLE model generalizes a multitude of
papers that study multilayer networks with the same community structures in all
layers, as well as the Mixture Multilayer Stochastic Block Model (MMLSBM),
where the layers in the same group have identical matrices of block connection
probabilities. While Pensky and Wang (2021) applied spectral clustering to the
proxy of the adjacency tensor, the present paper uses Sparse Subspace
Clustering (SSC) for identifying groups of layers with identical community
structures. Under mild conditions, the latter leads to strongly consistent
between-layer clustering. In addition, SSC can handle much larger networks
than the methodology of Pensky and Wang (2021) and is well suited to
parallel computing.
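To make the layer-clustering step concrete, below is a minimal sketch of generic sparse subspace clustering applied to vectorized adjacency layers: every layer is written as a sparse combination of the other layers, and spectral clustering is run on the resulting affinity matrix. The function `ssc_layer_clustering`, the self-tuned lasso penalty, and the toy simulation are illustrative assumptions, not the estimator and tuning analyzed in the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def ssc_layer_clustering(layers, n_groups, frac=0.7):
    """Generic sparse subspace clustering over network layers (a sketch).

    layers : array of shape (L, n, n) holding the L adjacency matrices.
    Each vectorized layer is regressed on the remaining layers with an
    l1 penalty; spectral clustering on the symmetrized coefficient matrix
    gives the between-layer partition.  Not the exact estimator or tuning
    analyzed in the paper.
    """
    L = layers.shape[0]
    X = layers.reshape(L, -1).astype(float)           # one row per layer
    X /= np.linalg.norm(X, axis=1, keepdims=True)     # unit-norm rows

    C = np.zeros((L, L))
    for i in range(L):                                # independent problems, easy to run in parallel
        idx = [j for j in range(L) if j != i]
        # smallest penalty that zeroes all coefficients; use a fraction of it
        alpha_max = np.max(np.abs(X[idx] @ X[i])) / X.shape[1]
        lasso = Lasso(alpha=frac * alpha_max, fit_intercept=False, max_iter=5000)
        lasso.fit(X[idx].T, X[i])                     # sparse self-representation
        C[i, idx] = lasso.coef_

    W = np.abs(C) + np.abs(C).T                       # symmetric affinity matrix
    return SpectralClustering(n_clusters=n_groups, affinity="precomputed",
                              random_state=0).fit_predict(W)

# Toy usage: 6 layers drawn from two different community structures.
rng = np.random.default_rng(0)
n = 60
z1, z2 = rng.integers(0, 2, n), rng.integers(0, 2, n)

def sbm_layer(z):
    P = np.where(z[:, None] == z[None, :], 0.7, 0.05)  # within/between-block probabilities
    A = np.triu(rng.random((n, n)) < P, 1)
    return (A + A.T).astype(float)

layers = np.stack([sbm_layer(z1) if l < 3 else sbm_layer(z2) for l in range(6)])
print(ssc_layer_clustering(layers, n_groups=2))        # should split layers 0-2 from layers 3-5
```

Because each sparse regression involves a single layer at a time, the loop parallelizes across layers, which is the scalability point made in the abstract; the penalty fraction and the toy block-model parameters would need tuning on real data.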
Related papers
- Signed Diverse Multiplex Networks: Clustering and Inference [4.070200285321219]
The setting is extended to a multiplex version, where all layers have the same collection of nodes and follow the Signed Generalized Random Dot Product Graph (SGRDPG) model.
The paper fulfills two objectives. First, it shows that keeping the signs of the edges in the process of network construction leads to better precision of estimation and clustering.
Second, by employing novel algorithms, our paper ensures strongly consistent clustering of layers and high accuracy of subspace estimation.
arXiv Detail & Related papers (2024-02-14T19:37:30Z) - Instance-Optimal Cluster Recovery in the Labeled Stochastic Block Model [79.46465138631592]
We devise an efficient algorithm that recovers clusters using the observed labels.
We present Instance-Adaptive Clustering (IAC), the first algorithm whose performance matches the instance-specific lower bounds both in expectation and with high probability.
arXiv Detail & Related papers (2023-06-18T08:46:06Z) - Deep Image Clustering with Contrastive Learning and Multi-scale Graph Convolutional Networks [58.868899595936476]
This paper presents a new deep clustering approach termed image clustering with contrastive learning and multi-scale graph convolutional networks (IcicleGCN).
Experiments on multiple image datasets demonstrate the superior clustering performance of IcicleGCN over the state-of-the-art.
arXiv Detail & Related papers (2022-07-14T19:16:56Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Linear Connectivity Reveals Generalization Strategies [54.947772002394736]
Some pairs of finetuned models have large barriers of increasing loss on the linear paths between them.
We find distinct clusters of models which are linearly connected on the test loss surface, but are disconnected from models outside the cluster.
Our work demonstrates how the geometry of the loss surface can guide models towards different functions.
arXiv Detail & Related papers (2022-05-24T23:43:02Z) - Community detection in multiplex networks based on orthogonal nonnegative matrix tri-factorization [26.53951886710295]
We introduce a new multiplex community detection approach that can identify communities that are common across layers as well as those that are unique to each layer.
The proposed algorithm is evaluated on both synthetic and real multiplex networks and compared to state-of-the-art techniques.
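For orientation, the tri-factorization mentioned above writes an adjacency matrix as A ≈ F S Gᵀ with nonnegative factors. The sketch below is an assumed single-layer baseline using standard multiplicative updates for the Frobenius loss; the orthogonality constraints and the coupling across the layers of the multiplex network, which are the point of the cited method, are deliberately omitted.

```python
import numpy as np

def nmtf(A, k, n_iter=300, eps=1e-9, seed=0):
    """Plain nonnegative matrix tri-factorization A ~ F @ S @ G.T.

    Standard multiplicative updates for the Frobenius loss; a generic
    single-matrix baseline, not the orthogonal multiplex method of the
    cited paper.
    """
    rng = np.random.default_rng(seed)
    n, m = A.shape
    F = rng.random((n, k))
    S = rng.random((k, k))
    G = rng.random((m, k))
    for _ in range(n_iter):
        F *= (A @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        G *= (A.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
        S *= (F.T @ A @ G) / (F.T @ F @ S @ G.T @ G + eps)
    return F, S, G

# Each node's community: the largest entry in its row of F, e.g.
# F, S, G = nmtf(A, k=3); labels = F.argmax(axis=1)
```

Reading communities off the argmax of F is only a heuristic; the cited approach processes all layers jointly so that communities shared across layers can be separated from those unique to a single layer, which this single-matrix sketch does not attempt.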
arXiv Detail & Related papers (2022-05-02T02:33:15Z) - Attention-driven Graph Clustering Network [49.040136530379094]
We propose a novel deep clustering method named Attention-driven Graph Clustering Network (AGCN).
AGCN exploits a heterogeneity-wise fusion module to dynamically fuse the node attribute feature and the topological graph feature.
AGCN can jointly perform feature learning and cluster assignment in an unsupervised fashion.
arXiv Detail & Related papers (2021-08-12T02:30:38Z) - ALMA: Alternating Minimization Algorithm for Clustering Mixture Multilayer Network [20.888592224540748]
The goal is to partition the multilayer network into clusters of similar layers, and to identify communities in those layers.
The present paper proposes a different technique, an alternating minimization algorithm (ALMA), that aims at simultaneous recovery of the layer partition and of the community structures within the layers.
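As a deliberately simplified, assumed stand-in for such an alternation (Euclidean distances, sample-mean centers, no community-recovery step), the skeleton below alternates layer-to-group assignment with group-center estimation over vectorized adjacency matrices; it illustrates the alternating structure only, not the actual ALMA updates.

```python
import numpy as np

def alternate_layer_clustering(layers, n_groups, n_iter=50, seed=0):
    """Toy alternating minimization over network layers.

    layers : (L, n, n) adjacency matrices.  Alternates (a) assigning each
    layer to its nearest group center with (b) re-estimating the centers;
    a k-means-style illustration, not the algorithm of the cited paper.
    """
    rng = np.random.default_rng(seed)
    X = layers.reshape(layers.shape[0], -1).astype(float)
    centers = X[rng.choice(len(X), n_groups, replace=False)]
    labels = np.full(len(X), -1)
    for _ in range(n_iter):
        # (a) assign each layer to the nearest group center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                                    # assignments stabilized
        labels = new_labels
        # (b) re-estimate each group center as the mean of its layers
        for g in range(n_groups):
            if np.any(labels == g):
                centers[g] = X[labels == g].mean(axis=0)
    return labels
```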
arXiv Detail & Related papers (2021-02-20T01:26:55Z) - Global and Individualized Community Detection in Inhomogeneous Multilayer Networks [14.191073951237772]
In network applications, it has become increasingly common to obtain datasets in the form of multiple networks observed on the same set of subjects.
Such datasets can be modeled by multilayer networks where each layer is a separate network itself while different layers are associated and share some common information.
The present paper studies community detection in a stylized yet informative inhomogeneous multilayer network model.
arXiv Detail & Related papers (2020-12-02T02:42:52Z) - Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z) - Consistency of Spectral Clustering on Hierarchical Stochastic Block Models [5.983753938303726]
We study the hierarchy of communities in real-world networks under a generic block model.
We prove the strong consistency of this method under a wide range of model parameters.
Unlike most existing work, our theory covers multiscale networks where the connection probabilities may differ by orders of magnitude.
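Since the entry above turns on spectral clustering of a block-model adjacency matrix, here is a minimal, assumed illustration of that primitive (the k leading eigenvectors of the adjacency matrix followed by k-means on their rows); the hierarchical, multiscale analysis of the cited paper is not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans

def spectral_communities(A, k, seed=0):
    """Adjacency spectral clustering: k-means on the rows of the matrix of
    the k leading (largest-magnitude) eigenvectors of a symmetric adjacency
    matrix A.  An illustrative baseline, not the paper's procedure."""
    vals, vecs = np.linalg.eigh(A)                  # eigenvalues in ascending order
    lead = np.argsort(np.abs(vals))[-k:]            # indices of the k largest in magnitude
    U = vecs[:, lead]
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(U)
```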
arXiv Detail & Related papers (2020-04-30T01:08:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences.