ALMA: Alternating Minimization Algorithm for Clustering Mixture
Multilayer Network
- URL: http://arxiv.org/abs/2102.10226v1
- Date: Sat, 20 Feb 2021 01:26:55 GMT
- Title: ALMA: Alternating Minimization Algorithm for Clustering Mixture
Multilayer Network
- Authors: Xing Fan, Marianna Pensky, Feng Yu, Teng Zhang
- Abstract summary: The goal is to partition the multilayer network into clusters of similar layers, and to identify communities in those layers.
The present paper proposes a different technique, an alternating minimization algorithm (ALMA), that aims at simultaneous recovery of the layer partition together with estimation of the connection probability matrices of the distinct layers.
- Score: 20.888592224540748
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The paper considers a Mixture Multilayer Stochastic Block Model (MMLSBM),
where layers can be partitioned into groups of similar networks, and networks
in each group are equipped with a distinct Stochastic Block Model. The goal is
to partition the multilayer network into clusters of similar layers, and to
identify communities in those layers. Jing et al. (2020) introduced the MMLSBM
and developed a clustering methodology, TWIST, based on regularized tensor
decomposition.
The present paper proposes a different technique, an alternating minimization
algorithm (ALMA), that aims at simultaneous recovery of the layer partition,
together with estimation of the matrices of connection probabilities of the
distinct layers. Compared to TWIST, ALMA achieves higher accuracy both
theoretically and numerically.
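For illustration, here is a minimal NumPy sketch of the alternating-minimization idea: layers are alternately re-assigned to the layer cluster whose current connection-probability estimate fits them best, and each cluster's estimate is recomputed (with a rank truncation reflecting its block structure) from the layers assigned to it. The random initialization, the plain Frobenius-norm assignment, and the SVD truncation are simplifying assumptions for exposition, not the exact ALMA updates analyzed in the paper.

```python
import numpy as np

def alma_sketch(A, M, K, n_iter=20, seed=0):
    """Simplified alternating-minimization sketch for layer clustering.

    A : (L, n, n) array of layer adjacency matrices
    M : number of layer clusters
    K : target rank (number of communities) used to truncate each estimate
    Returns layer labels and the estimated connection-probability matrices.
    """
    rng = np.random.default_rng(seed)
    L, n, _ = A.shape
    labels = rng.integers(0, M, size=L)          # random initial layer partition
    Q = np.zeros((M, n, n))

    for _ in range(n_iter):
        # Step 1: re-estimate each cluster's connection-probability matrix
        for m in range(M):
            members = A[labels == m]
            Q[m] = members.mean(axis=0) if len(members) else rng.random((n, n))
            # keep only a rank-K approximation, reflecting the K-block structure
            U, s, Vt = np.linalg.svd(Q[m])
            Q[m] = (U[:, :K] * s[:K]) @ Vt[:K]

        # Step 2: reassign every layer to the closest cluster in Frobenius norm
        dists = ((A[:, None, :, :] - Q[None, :, :, :]) ** 2).sum(axis=(2, 3))
        labels = dists.argmin(axis=1)

    return labels, Q
```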
Related papers
- Community detection by spectral methods in multi-layer networks [0.0]
Community detection in multi-layer networks is a crucial problem in network analysis.
One of the proposed algorithms is based on the sum of adjacency matrices, while the other utilizes the debiased sum of squared adjacency matrices.
Numerical simulations confirm that our algorithm, employing the debiased sum of squared adjacency matrices, surpasses existing methods for community detection in multi-layer networks.
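As a concrete illustration of the debiased sum of squared adjacency matrices mentioned in this entry, the following NumPy/scikit-learn sketch squares each layer, subtracts the diagonal degree matrix (the bias introduced by squaring), sums across layers, and clusters the leading eigenvectors with k-means. The function name and the k-means step are illustrative choices, not taken verbatim from the cited paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def debiased_sum_of_squares_clustering(A, K, seed=0):
    """Spectral clustering on the debiased sum of squared adjacency matrices.

    A : (L, n, n) array of symmetric 0/1 adjacency matrices
    K : number of communities
    """
    L, n, _ = A.shape
    S = np.zeros((n, n))
    for Al in A:
        # A^2 counts paths of length two; its diagonal equals the node degrees,
        # which is the bias removed here
        S += Al @ Al - np.diag(Al.sum(axis=1))

    # leading eigenvectors of the aggregated matrix serve as node embeddings
    eigvals, eigvecs = np.linalg.eigh(S)
    X = eigvecs[:, -K:]                        # eigenvectors of the K largest eigenvalues
    return KMeans(n_clusters=K, n_init=10, random_state=seed).fit_predict(X)
```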
arXiv Detail & Related papers (2024-03-19T08:29:47Z)
- Mixture of multilayer stochastic block models for multiview clustering [0.0]
We propose an original method for aggregating multiple clusterings coming from different sources of information.
The identifiability of the model parameters is established and a variational Bayesian EM algorithm is proposed for the estimation of these parameters.
The method is utilized to analyze global food trading networks, leading to structures of interest.
arXiv Detail & Related papers (2024-01-09T17:15:47Z)
- Optimal Clustering of Discrete Mixtures: Binomial, Poisson, Block Models, and Multi-layer Networks [9.57586103097079]
We study the fundamental limit of clustering networks when multiple layers are present.
Under the mixture multi-layer stochastic block model (MMSBM), we show that the minimax optimal network clustering error rate takes an exponential form.
We propose a novel two-stage network clustering method including a tensor-based algorithm involving both node and sample splitting.
arXiv Detail & Related papers (2023-11-27T07:48:50Z)
- Instance-Optimal Cluster Recovery in the Labeled Stochastic Block Model [79.46465138631592]
We devise an efficient algorithm that recovers clusters using the observed labels.
We present Instance-Adaptive Clustering (IAC), the first algorithm whose performance matches the instance-specific lower bounds both in expectation and with high probability.
arXiv Detail & Related papers (2023-06-18T08:46:06Z)
- WLD-Reg: A Data-dependent Within-layer Diversity Regularizer [98.78384185493624]
Neural networks are composed of multiple layers arranged in a hierarchical structure and trained jointly with gradient-based optimization.
We propose to complement this traditional 'between-layer' feedback with additional 'within-layer' feedback to encourage the diversity of the activations within the same layer.
We present an extensive empirical study confirming that the proposed approach enhances the performance of several state-of-the-art neural network models in multiple tasks.
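To make the 'within-layer' feedback concrete, the sketch below computes a penalty that grows with the average pairwise cosine similarity between unit activations in one layer; adding it to the task loss pushes units in the same layer toward more diverse responses. The cosine-similarity form and the function name are illustrative assumptions, not the regularizer defined in the paper.

```python
import numpy as np

def within_layer_diversity_penalty(H, eps=1e-8):
    """Diversity penalty for one layer's activations H of shape (batch, units).

    Returns the mean pairwise cosine similarity between unit activation
    vectors; adding this to the task loss discourages redundant units.
    """
    Hn = H / (np.linalg.norm(H, axis=0, keepdims=True) + eps)  # normalize each unit
    sim = Hn.T @ Hn                                            # (units, units) cosine similarities
    off_diag = sim - np.diag(np.diag(sim))                     # ignore self-similarity
    d = H.shape[1]
    return off_diag.sum() / (d * (d - 1))

# usage sketch: total_loss = task_loss + lam * within_layer_diversity_penalty(H)
```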
arXiv Detail & Related papers (2023-01-03T20:57:22Z)
- Sparse Subspace Clustering in Diverse Multiplex Network Model [4.56877715768796]
The paper considers the DIverse MultiPLEx (DIMPLE) network model, where all layers of the network have the same collection of nodes and are equipped with Stochastic Block Models.
The DIMPLE model generalizes the settings of a multitude of papers that study multilayer networks with the same community structures in all layers.
The present paper uses Sparse Subspace Clustering (SSC) for identifying groups of layers with identical community structures.
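A compact sketch of the generic sparse-subspace-clustering recipe referenced in this entry, assuming each layer has already been summarized by a feature vector (for instance, its vectorized leading eigenvectors): each layer is sparsely regressed on the others, the coefficients form an affinity between layers, and spectral clustering of that affinity yields the groups. The lasso penalty, the preprocessing, and the function name are assumptions; the cited paper's estimator may differ.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def ssc_layer_grouping(X, n_groups, alpha=0.05, seed=0):
    """Group layers via sparse subspace clustering.

    X : (L, d) array, one feature vector per layer
    n_groups : number of layer groups to recover
    """
    L = X.shape[0]
    C = np.zeros((L, L))
    for i in range(L):
        others = np.delete(X, i, axis=0)                 # do not let a layer represent itself
        coef = Lasso(alpha=alpha, fit_intercept=False).fit(others.T, X[i]).coef_
        C[i, np.arange(L) != i] = coef                   # sparse self-representation coefficients

    W = np.abs(C) + np.abs(C).T                          # symmetric affinity between layers
    sc = SpectralClustering(n_clusters=n_groups, affinity="precomputed", random_state=seed)
    return sc.fit_predict(W)
```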
arXiv Detail & Related papers (2022-06-15T15:32:23Z)
- DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z)
- A new perspective on probabilistic image modeling [92.89846887298852]
We present a new probabilistic approach for image modeling capable of density estimation, sampling and tractable inference.
Deep Convolutional Gaussian Mixture Models (DCGMMs) can be trained end-to-end by SGD from random initial conditions, much like CNNs.
We show that DCGMMs compare favorably to several recent PC and SPN models in terms of inference, classification and sampling.
arXiv Detail & Related papers (2022-03-21T14:53:57Z)
- Gated recurrent units and temporal convolutional network for multilabel classification [122.84638446560663]
This work proposes a new ensemble method for managing multilabel classification.
The core of the proposed approach combines a set of gated recurrent units and temporal convolutional neural networks trained with variants of the Adam optimization algorithm.
arXiv Detail & Related papers (2021-10-09T00:00:16Z)
- Clustering Ensemble Meets Low-rank Tensor Approximation [50.21581880045667]
This paper explores the problem of clustering ensemble, which aims to combine multiple base clusterings to produce better performance than any individual one.
We propose a novel low-rank tensor approximation-based method to solve the problem from a global perspective.
Experimental results over 7 benchmark data sets show that the proposed model achieves a breakthrough in clustering performance, compared with 12 state-of-the-art methods.
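One way to picture a global, low-rank treatment of a clustering ensemble is sketched below: stack the co-association matrices of the base clusterings into a tensor, keep only its leading singular directions after unfolding, and cluster the resulting consensus similarity. This generic construction is only meant to convey the idea of a low-rank consensus; it is not the tensor model or optimization used in the cited paper.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def lowrank_ensemble_consensus(base_labels, n_clusters, rank=1, seed=0):
    """Combine base clusterings via a low-rank approximation of their
    co-association tensor, then cluster the consensus similarity.

    base_labels : (m, n) array, m base clusterings of n samples
    """
    m, n = base_labels.shape
    # co-association tensor: one n-by-n agreement matrix per base clustering
    T = np.stack([(lab[:, None] == lab[None, :]).astype(float) for lab in base_labels])

    # unfold to (m, n*n) and keep only the leading singular directions,
    # a global low-rank denoising of the ensemble
    U, s, Vt = np.linalg.svd(T.reshape(m, n * n), full_matrices=False)
    consensus = (s[:rank, None] * Vt[:rank]).sum(axis=0).reshape(n, n)

    consensus = np.abs(consensus)                     # keep a valid nonnegative affinity
    consensus = (consensus + consensus.T) / 2
    sc = SpectralClustering(n_clusters=n_clusters, affinity="precomputed", random_state=seed)
    return sc.fit_predict(consensus)
```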
arXiv Detail & Related papers (2020-12-16T13:01:37Z)
- Spectral clustering via adaptive layer aggregation for multi-layer networks [6.0073653636512585]
We propose integrative spectral clustering approaches based on effective convex layer aggregations.
We show that our methods are remarkably competitive compared to several popular existing methods.
arXiv Detail & Related papers (2020-12-07T21:58:18Z)
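The convex-aggregation idea in the last entry can be sketched as follows: take a weighted sum of the layer adjacency matrices with nonnegative weights summing to one and run spectral clustering on the aggregate. Uniform weights are used below for simplicity; choosing the weights adaptively, as the cited paper does, is not reproduced in this sketch, and the function name is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def aggregated_spectral_clustering(A, K, weights=None, seed=0):
    """Spectral clustering of a convex combination of layer adjacency matrices.

    A : (L, n, n) array of symmetric adjacency matrices
    K : number of communities
    weights : optional length-L nonnegative weights summing to one
    """
    L, n, _ = A.shape
    if weights is None:
        weights = np.full(L, 1.0 / L)              # uniform aggregation as a default
    S = np.tensordot(weights, A, axes=1)           # weighted sum of the layers

    eigvals, eigvecs = np.linalg.eigh(S)
    X = eigvecs[:, -K:]                            # embeddings from the K leading eigenvectors
    return KMeans(n_clusters=K, n_init=10, random_state=seed).fit_predict(X)
```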
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.