Bayesian community detection for networks with covariates
- URL: http://arxiv.org/abs/2203.02090v2
- Date: Thu, 6 Apr 2023 18:57:17 GMT
- Title: Bayesian community detection for networks with covariates
- Authors: Luyi Shen, Arash Amini, Nathaniel Josephs, and Lizhen Lin
- Abstract summary: Among learning tasks with network data, community detection has arguably received the most attention in the scientific community.
We propose a Bayesian stochastic block model with a covariate-dependent random partition prior.
Our model has the ability to learn the number of communities via posterior inference without having to assume it to be known.
- Score: 16.230648949593153
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The increasing prevalence of network data in a vast variety of fields and the
need to extract useful information out of them have spurred fast developments
in related models and algorithms. Among the various learning tasks with network
data, community detection, the discovery of node clusters or "communities," has
arguably received the most attention in the scientific community. In many
real-world applications, the network data often come with additional
information in the form of node or edge covariates that should ideally be
leveraged for inference. In this paper, we add to a limited literature on
community detection for networks with covariates by proposing a Bayesian
stochastic block model with a covariate-dependent random partition prior. Under
our prior, the covariates are explicitly expressed in specifying the prior
distribution on the cluster membership. Our model has the flexibility to model
the uncertainty of all parameter estimates, including the community
memberships. Importantly, and unlike the majority of existing methods, our model
has the ability to learn the number of communities via posterior inference
without having to assume it to be known. Our model can be applied to community
detection in both dense and sparse networks, with both categorical and
continuous covariates, and our MCMC algorithm is very efficient with good
mixing properties. We demonstrate the superior performance of our model over
existing models in a comprehensive simulation study and an application to two
real datasets.
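As a rough illustration of the idea behind a covariate-dependent random partition prior, the sketch below draws a partition from a simplified CRP-style allocation whose cluster weights are tilted by covariate similarity. The function name, similarity kernel, and parameters (alpha, tau) are illustrative assumptions and do not reproduce the paper's exact prior or its MCMC sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

def covariate_crp_sample(X, alpha=1.0, tau=1.0):
    """Sample a partition from a simplified covariate-dependent CRP.

    Illustrative only: node i joins an existing cluster with probability
    proportional to (cluster size) * (mean covariate similarity to that
    cluster), or opens a new cluster with probability proportional to alpha.
    The paper's actual prior may differ in form and normalization.
    """
    n = X.shape[0]
    z = np.zeros(n, dtype=int)          # cluster labels
    clusters = [[0]]                    # members of each cluster
    for i in range(1, n):
        weights = []
        for members in clusters:
            sims = np.exp(-tau * np.linalg.norm(X[members] - X[i], axis=1))
            weights.append(len(members) * sims.mean())
        weights.append(alpha)           # weight for opening a new cluster
        probs = np.array(weights) / np.sum(weights)
        k = rng.choice(len(probs), p=probs)
        if k == len(clusters):
            clusters.append([i])
        else:
            clusters[k].append(i)
        z[i] = k
    return z

# toy covariates: two well-separated groups in R^2
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
print(covariate_crp_sample(X))
```

Under such a prior, nodes with similar covariates tend to be co-clustered before any network information is used; in the full model this prior is combined with a stochastic block likelihood on the observed edges.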
Related papers
- Sifting out communities in large sparse networks [2.666294200266662]
We introduce an intuitive objective function for quantifying the quality of clustering results in large sparse networks.
We utilize a two-step method for identifying communities which is especially well-suited for this domain.
We identify complex genetic interactions in large-scale networks comprised of tens of thousands of nodes.
arXiv Detail & Related papers (2024-05-01T18:57:41Z)
- A stochastic block model for community detection in attributed networks [7.128313939076842]
Existing community detection methods mostly focus on network structure, while methods that integrate node attributes mainly target traditional community structures.
This paper proposes a block model that integrates the betweenness centrality and clustering coefficient of nodes for community detection in attributed networks.
Its performance is superior to that of five competing algorithms.
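For concreteness, the two node statistics named above can be computed with networkx as in the sketch below; how the cited block model combines them with node attributes and the block likelihood is not reproduced here.

```python
import networkx as nx

# toy network standing in for an attributed graph
G = nx.karate_club_graph()

# node-level structural statistics used alongside node attributes
betweenness = nx.betweenness_centrality(G)   # dict: node -> centrality
clustering = nx.clustering(G)                # dict: node -> clustering coeff.

for v in list(G.nodes)[:5]:
    print(v, round(betweenness[v], 3), round(clustering[v], 3))
```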
arXiv Detail & Related papers (2023-08-31T01:00:24Z)
- A Comprehensive Survey on Community Detection with Deep Learning [93.40332347374712]
Within a network, a community reveals features and connections of its members that differ from those in other communities.
This survey proposes a new taxonomy covering different categories of state-of-the-art methods.
The main category, i.e., deep neural networks, is further divided into convolutional networks, graph attention networks, generative adversarial networks and autoencoders.
arXiv Detail & Related papers (2021-05-26T14:37:07Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embeddings from high-dimensional attributes and local structure.
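A minimal sketch of the generic idea of node-versus-local-subgraph contrastive instance pairs, assuming networkx ego networks as the "local subgraph"; the cited framework's sampling, masking, and GNN encoder details are not reproduced here.

```python
import random
import networkx as nx

random.seed(0)
G = nx.karate_club_graph()

def sample_instance_pair(G, radius=1):
    """Sample one positive and one negative (node, local subgraph) pair.

    Positive: a target node paired with its own ego network.
    Negative: the same target paired with another node's ego network.
    This mirrors the general node-vs-subgraph contrastive idea only.
    """
    v, u = random.sample(list(G.nodes), 2)
    pos = (v, nx.ego_graph(G, v, radius=radius))
    neg = (v, nx.ego_graph(G, u, radius=radius))
    return pos, neg

(pos_node, pos_sub), (_, neg_sub) = sample_instance_pair(G)
print(pos_node, pos_sub.number_of_nodes(), neg_sub.number_of_nodes())
```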
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Deep Archimedean Copulas [98.96141706464425]
ACNet is a novel differentiable neural network architecture that enforces the structural properties of Archimedean copulas.
We show that ACNet is able to both approximate common Archimedean Copulas and generate new copulas which may provide better fits to data.
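For reference, the Archimedean structure that a learned generator must respect is C(u, v) = phi^{-1}(phi(u) + phi(v)); the sketch below instantiates it with the closed-form Clayton generator rather than ACNet's neural network.

```python
import numpy as np

def clayton_gen(u, theta):
    """Clayton generator phi(u) = u^(-theta) - 1, mapping (0, 1] to [0, inf)."""
    return u ** (-theta) - 1.0

def clayton_gen_inv(t, theta):
    """Inverse generator phi^{-1}(t) = (1 + t)^(-1/theta)."""
    return (1.0 + t) ** (-1.0 / theta)

def archimedean_copula(u, v, theta=2.0):
    """C(u, v) = phi^{-1}(phi(u) + phi(v)), here with the Clayton generator.

    ACNet learns the generator with a constrained network instead of using a
    closed form; this example only shows the shared Archimedean structure.
    """
    return clayton_gen_inv(clayton_gen(u, theta) + clayton_gen(v, theta), theta)

print(archimedean_copula(0.5, 0.5))   # joint CDF at (0.5, 0.5)
print(archimedean_copula(0.5, 1.0))   # marginal consistency: C(u, 1) = u
```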
arXiv Detail & Related papers (2020-12-05T22:58:37Z)
- Community models for networks observed through edge nominations [6.442024233731203]
Communities are a common and widely studied structure in networks, typically under the assumption that the network is fully and correctly observed.
We propose a general model for a class of network sampling mechanisms based on recording edges via querying nodes.
We show community detection can be performed by spectral clustering under this general class of models.
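A generic spectral-clustering recipe for community detection on an observed adjacency matrix is sketched below (eigenvectors of the normalized adjacency followed by a lightweight k-means); the paper analyzes a variant adapted to edge-nomination sampling, which is not reproduced here.

```python
import numpy as np
from numpy.linalg import eigh

def spectral_communities(A, k, n_iter=50, seed=0):
    """Cluster the nodes of adjacency matrix A into k communities."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    L = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]   # D^{-1/2} A D^{-1/2}
    _, vecs = eigh(L)
    U = vecs[:, -k:]                                     # top-k eigenvectors
    # lightweight k-means on the rows of U
    rng = np.random.default_rng(seed)
    centers = U[rng.choice(len(U), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((U[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([U[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

# toy two-block network drawn from a planted partition
rng = np.random.default_rng(1)
B = np.array([[0.5, 0.05], [0.05, 0.5]])
z = np.repeat([0, 1], 20)
P = B[z][:, z]
A = (rng.random((40, 40)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T
print(spectral_communities(A, 2))
```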
arXiv Detail & Related papers (2020-08-09T04:53:13Z)
- A Multi-Semantic Metapath Model for Large Scale Heterogeneous Network Representation Learning [52.83948119677194]
We propose a multi-semantic metapath (MSM) model for large scale heterogeneous representation learning.
Specifically, we generate multi-semantic metapath-based random walks to construct the heterogeneous neighborhood to handle the unbalanced distributions.
We conduct systematic evaluations of the proposed framework on two challenging datasets: Amazon and Alibaba.
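A minimal sketch of a metapath-guided random walk on a toy heterogeneous graph; the graph, node types, and metapath below are made-up examples, and the MSM model's multi-semantic mixing and skip-gram training are omitted.

```python
import random

random.seed(0)

# toy heterogeneous graph: node -> list of (neighbor, neighbor_type) pairs
het_graph = {
    "u1": [("i1", "item"), ("i2", "item")],
    "u2": [("i2", "item")],
    "i1": [("u1", "user"), ("c1", "category")],
    "i2": [("u1", "user"), ("u2", "user"), ("c1", "category")],
    "c1": [("i1", "item"), ("i2", "item")],
}

def metapath_walk(start, metapath, length):
    """Random walk constrained to cycle through a node-type pattern."""
    walk, cur = [start], start
    for step in range(length):
        wanted = metapath[(step + 1) % len(metapath)]   # type required next
        candidates = [v for v, t in het_graph[cur] if t == wanted]
        if not candidates:
            break
        cur = random.choice(candidates)
        walk.append(cur)
    return walk

# user-item-user-... walk starting from user u1
print(metapath_walk("u1", ("user", "item"), length=6))
```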
arXiv Detail & Related papers (2020-07-19T22:50:20Z)
- Extended Stochastic Block Models with Application to Criminal Networks [3.2211782521637393]
We study covert networks that encode relationships among criminals.
The coexistence of noisy block patterns limits the reliability of routinely-used community detection algorithms.
We develop a new class of extended stochastic block models (ESBM) that infer groups of nodes having common connectivity patterns.
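As a hedged illustration of the kind of block likelihood such models score partitions with, the sketch below computes a collapsed Bernoulli SBM log-likelihood with Beta priors on the block connection probabilities; the ESBM's richer partition priors and inference are not reproduced.

```python
import numpy as np
from scipy.special import betaln

def sbm_loglik(A, z, a0=1.0, b0=1.0):
    """Collapsed log-likelihood of a symmetric adjacency A under labels z.

    Beta(a0, b0) priors on block probabilities are integrated out, giving a
    Beta-Binomial score per block pair.
    """
    K = z.max() + 1
    ll = 0.0
    for r in range(K):
        for s in range(r, K):
            rows, cols = np.where(z == r)[0], np.where(z == s)[0]
            block = A[np.ix_(rows, cols)]
            if r == s:                       # count each dyad once
                iu = np.triu_indices(len(rows), k=1)
                edges, dyads = block[iu].sum(), len(iu[0])
            else:
                edges, dyads = block.sum(), block.size
            ll += betaln(a0 + edges, b0 + dyads - edges) - betaln(a0, b0)
    return ll

A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
print(sbm_loglik(A, np.array([0, 0, 1, 1])))
```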
arXiv Detail & Related papers (2020-07-16T19:06:16Z)
- Diversity inducing Information Bottleneck in Model Ensembles [73.80615604822435]
In this paper, we target the problem of generating effective ensembles of neural networks by encouraging diversity in prediction.
We explicitly optimize a diversity-inducing adversarial loss for learning latent variables and thereby obtain diversity in the output predictions necessary for modeling multi-modal data.
We show significant improvements in classification accuracy over the most competitive baselines under a shift in the data distribution.
arXiv Detail & Related papers (2020-03-10T03:10:41Z)
- PushNet: Efficient and Adaptive Neural Message Passing [1.9121961872220468]
Message passing neural networks have recently evolved into a state-of-the-art approach to representation learning on graphs.
Existing methods perform synchronous message passing along all edges in multiple subsequent rounds.
We consider a novel asynchronous message passing approach where information is pushed only along the most relevant edges until convergence.
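A minimal sketch of push-based, asynchronous propagation in the spirit of approximate personalized-PageRank push: residual signal is pushed only from nodes whose residual is still large, until convergence. PushNet's actual operator, adaptive stopping rule, and learned components are not reproduced here.

```python
import numpy as np

def push_propagate(A, x, alpha=0.15, eps=1e-4):
    """Propagate a node signal x over adjacency A by asynchronous pushes."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    p = np.zeros_like(x)          # accumulated (propagated) signal
    r = x.copy()                  # residual signal still to be pushed
    active = list(range(n))
    while active:
        u = active.pop()
        if abs(r[u]) <= eps * max(deg[u], 1):
            continue              # residual too small, skip this node
        p[u] += alpha * r[u]
        share = (1 - alpha) * r[u] / max(deg[u], 1)
        r[u] = 0.0
        for v in np.nonzero(A[u])[0]:
            r[v] += share
            if abs(r[v]) > eps * max(deg[v], 1):
                active.append(v)  # only relevant neighbors become active
    return p

# toy graph: a 4-cycle, pushing a unit signal from node 0
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
print(push_propagate(A, np.array([1.0, 0.0, 0.0, 0.0])))
```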
arXiv Detail & Related papers (2020-03-04T18:15:30Z)
- Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
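A simplified sketch of layer-wise fusion by neuron alignment: neurons of one model are matched to the other's by solving an assignment problem on weight distances (a hard, balanced special case of optimal transport) and then averaged; the actual algorithm uses full OT couplings and aligns layers sequentially, accounting for permutations of incoming weights.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def fuse_layer(W_a, W_b):
    """Fuse one weight matrix from each of two models (rows = neurons)."""
    # pairwise distances between neurons of model A and model B
    cost = np.linalg.norm(W_a[:, None, :] - W_b[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)   # neuron i of A <-> cols[i] of B
    return 0.5 * (W_a[rows] + W_b[cols])       # average the aligned weights

rng = np.random.default_rng(0)
W_a = rng.normal(size=(4, 3))
W_b = W_a[[2, 0, 3, 1]] + 0.01 * rng.normal(size=(4, 3))  # permuted, noisy copy
print(fuse_layer(W_a, W_b))
```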
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.