Modularity of the ABCD Random Graph Model with Community Structure
- URL: http://arxiv.org/abs/2203.01480v1
- Date: Thu, 3 Mar 2022 01:49:46 GMT
- Title: Modularity of the ABCD Random Graph Model with Community Structure
- Authors: Bogumil Kaminski, Bartosz Pankratz, Pawel Pralat, Francois Theberge
- Abstract summary: The ABCD graph is a random graph model with community structure and power-law distribution for both degrees and community sizes.
We analyze the modularity function, arguably the most important graph property of networks in the context of community detection.
It is also used as a quality function in many community detection algorithms, including the widely used Louvain algorithm.
- Score: 2.580765958706854
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Artificial Benchmark for Community Detection (ABCD) graph is a random
graph model with community structure and power-law distribution for both
degrees and community sizes. The model generates graphs with similar properties
as the well-known LFR one, and its main parameter $\xi$ can be tuned to mimic
its counterpart in the LFR model, the mixing parameter $\mu$.
In this paper, we investigate various theoretical asymptotic properties of
the ABCD model. In particular, we analyze the modularity function, arguably
the most important graph property of networks in the context of community
detection. Indeed, the modularity function is often used to measure the
presence of community structure in networks. It is also used as a quality
function in many community detection algorithms, including the widely used
Louvain algorithm.
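For concreteness, the modularity referred to above is the standard quantity for a vertex partition $\mathcal{A} = \{A_1, \ldots, A_k\}$: $q(\mathcal{A}) = \sum_i \left( e(A_i)/|E| - \left(\mathrm{vol}(A_i)/(2|E|)\right)^2 \right)$, where $e(A_i)$ counts edges with both endpoints in $A_i$ and $\mathrm{vol}(A_i)$ sums the degrees of its vertices. The minimal Python sketch below implements this definition directly; the function and variable names are ours, chosen for illustration, not taken from the paper.

```python
# Minimal sketch of the standard modularity of a vertex partition:
#   q(A) = sum_i ( e(A_i)/|E| - (vol(A_i) / (2|E|))^2 ),
# where e(A_i) counts edges with both endpoints in part A_i and vol(A_i) is the
# sum of degrees of the vertices of A_i. Assumes a simple graph (no self-loops).
from collections import defaultdict

def modularity(edges, partition):
    """edges: iterable of (u, v) pairs; partition: dict mapping node -> part label."""
    m = 0                          # |E|
    internal = defaultdict(int)    # e(A_i): edges inside part A_i
    volume = defaultdict(int)      # vol(A_i): sum of degrees in part A_i
    for u, v in edges:
        m += 1
        volume[partition[u]] += 1
        volume[partition[v]] += 1
        if partition[u] == partition[v]:
            internal[partition[u]] += 1
    return sum(internal[c] / m - (volume[c] / (2 * m)) ** 2 for c in volume)

# Tiny check: two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(modularity(edges, {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}))  # ~0.357
```

Louvain, mentioned above, heuristically searches for a partition that maximizes exactly this quantity.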
Related papers
- Scalable Weibull Graph Attention Autoencoder for Modeling Document Networks [50.42343781348247]
We develop a graph Poisson factor analysis (GPFA) which provides analytic conditional posteriors to improve the inference accuracy.
We also extend GPFA to a multi-stochastic-layer version named graph Poisson gamma belief network (GPGBN) to capture the hierarchical document relationships at multiple semantic levels.
Our models can extract high-quality hierarchical latent document representations and achieve promising performance on various graph analytic tasks.
arXiv Detail & Related papers (2024-10-13T02:22:14Z) - Artificial Benchmark for Community Detection with Outliers (ABCD+o) [5.8010446129208155]
We extend the ABCD model to include potential outliers.
We perform some exploratory experiments on both the new ABCD+o model as well as a real-world network to show that outliers possess some desired, distinguishable properties.
arXiv Detail & Related papers (2023-01-13T20:14:44Z) - Hypergraph Artificial Benchmark for Community Detection (h-ABCD) [5.8010446129208155]
h-ABCD produces random hypergraphs in which ground-truth community sizes and degrees follow power-law distributions.
As in the original ABCD, the new model h-ABCD can produce hypergraphs with various levels of noise.
arXiv Detail & Related papers (2022-10-26T20:06:56Z) - Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Consistency of Spectral Clustering on Hierarchical Stochastic Block
Models [5.983753938303726]
We study the hierarchy of communities in real-world networks under a generic block model.
We prove the strong consistency of this method under a wide range of model parameters.
Unlike most existing work, our theory covers multiscale networks where the connection probabilities may differ by orders of magnitude.
arXiv Detail & Related papers (2020-04-30T01:08:59Z) - Struct-MMSB: Mixed Membership Stochastic Blockmodels with Interpretable
Structured Priors [13.712395104755783]
The mixed membership stochastic blockmodel (MMSB) is a popular framework for community detection and network generation.
We present a flexible MMSB model, Struct-MMSB, that uses a recently developed statistical relational learning model, hinge-loss Markov random fields (HL-MRFs).
Our model is capable of learning latent characteristics in real-world networks via meaningful latent variables encoded as a complex combination of observed features and membership distributions.
arXiv Detail & Related papers (2020-02-21T19:32:32Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z) - Artificial Benchmark for Community Detection (ABCD): Fast Random Graph
Model with Community Structure [5.8010446129208155]
We provide an alternative random graph model with community structure and power-law distribution for both degrees and community sizes, the Artificial Benchmark for Community Detection (ABCD).
ABCD is fast, simple, and can be easily tuned to allow the user to make a smooth transition between the two extremes: pure (independent) communities and a random graph with no community structure (a toy illustration of this mixing behaviour follows after this list).
arXiv Detail & Related papers (2020-01-14T17:20:27Z)
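The last entry above is the original ABCD paper, whose mixing behaviour the toy sketch below illustrates. This is not the ABCD construction itself (ABCD additionally matches power-law degree and community-size sequences); it is only a planted-partition style illustration, under our own assumptions, of how a mixing parameter like $\xi$ moves between pure communities ($\xi = 0$) and a random graph with no community structure ($\xi = 1$). All names and parameters below are ours.

```python
# Toy sketch only: NOT the ABCD generator from the paper. It illustrates how a
# mixing parameter xi interpolates between pure communities (xi = 0) and a
# random graph with no community structure (xi = 1).
import random

def toy_mixed_graph(n=600, n_communities=3, avg_degree=10, xi=0.2, seed=42):
    rng = random.Random(seed)
    community = {v: v % n_communities for v in range(n)}   # equal-sized ground-truth parts
    members = [[v for v in range(n) if community[v] == c] for c in range(n_communities)]
    edges = set()
    for v in range(n):
        for _ in range(avg_degree // 2):                   # each node proposes ~avg_degree/2 edges
            if rng.random() < xi:
                u = rng.randrange(n)                        # "background" edge: endpoint anywhere
            else:
                u = rng.choice(members[community[v]])       # "community" edge: endpoint in same part
            if u != v:
                edges.add((min(u, v), max(u, v)))
    return list(edges), community

# Fraction of edges that land inside their ground-truth community, for a few xi values.
for xi in (0.0, 0.2, 0.5, 1.0):
    edges, community = toy_mixed_graph(xi=xi)
    internal = sum(community[u] == community[v] for u, v in edges)
    print(xi, round(internal / len(edges), 3))
```

As $\xi$ grows, the fraction of edges kept inside their ground-truth community drops from 1 toward roughly $1/k$ for $k$ equal-sized communities, which is the sense in which the parameter tunes the strength of the community structure.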