On the Power of Edge Independent Graph Models
- URL: http://arxiv.org/abs/2111.00048v1
- Date: Fri, 29 Oct 2021 19:12:14 GMT
- Title: On the Power of Edge Independent Graph Models
- Authors: Sudhanshu Chanpuriya, Cameron Musco, Konstantinos Sotiropoulos, and
Charalampos E. Tsourakakis
- Abstract summary: We study the limitations of edge independent random graph models, in which each edge is added to the graph independently with some probability.
We prove that, subject to a bounded overlap condition, edge independent models are inherently limited in their ability to generate graphs with high triangle and other subgraph densities.
- Score: 22.085932117823738
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Why do many modern neural-network-based graph generative models fail to
reproduce typical real-world network characteristics, such as high triangle
density? In this work we study the limitations of edge independent random graph
models, in which each edge is added to the graph independently with some
probability. Such models include both the classic Erdős-Rényi and
stochastic block models, as well as modern generative models such as NetGAN,
variational graph autoencoders, and CELL. We prove that subject to a bounded
overlap condition, which ensures that the model does not simply memorize a
single graph, edge independent models are inherently limited in their ability
to generate graphs with high triangle and other subgraph densities. Notably,
such high densities are known to appear in real-world social networks and other
graphs. We complement our negative results with a simple generative model that
balances overlap and accuracy, performing comparably to more complex models in
reconstructing many graph statistics.
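To make the class of models concrete, here is a minimal sketch (illustrative only, not code from the paper): an edge independent model is specified by a symmetric matrix P of edge probabilities, a graph is sampled by including each edge {i, j} independently with probability P[i, j], and by linearity of expectation the expected number of triangles is the sum over node triples i < j < k of P_ij * P_jk * P_ik. The function names and the NumPy implementation below are assumptions made for illustration.

```python
import numpy as np

def sample_edge_independent(P, rng=None):
    """Sample an undirected graph in which each edge {i, j} is included
    independently with probability P[i, j] (P symmetric, diagonal ignored)."""
    rng = np.random.default_rng() if rng is None else rng
    n = P.shape[0]
    upper = np.triu(rng.random((n, n)) < P, k=1)  # decide each edge exactly once
    return (upper | upper.T).astype(int)          # symmetric 0/1 adjacency matrix

def expected_triangles(P):
    """Expected triangle count under edge independence:
    sum over i < j < k of P_ij * P_jk * P_ik, i.e. trace(Q^3) / 6 with Q = P off-diagonal."""
    Q = P.copy().astype(float)
    np.fill_diagonal(Q, 0.0)
    return np.trace(Q @ Q @ Q) / 6.0

# Example: the Erdos-Renyi model G(n, p) is the special case of a constant P.
n, p = 200, 0.05
P = np.full((n, n), p)
A = sample_edge_independent(P)
print(A.sum() // 2, expected_triangles(P))  # realized edge count vs. expected triangles
```

The Erdős-Rényi and stochastic block models fit this template directly (constant or block-constant P), and learned models such as NetGAN, variational graph autoencoders, and CELL fit it once their decoders are read as producing a probability matrix P; the paper's bounded overlap argument applies to any such P, however it was obtained. A second sketch after the related-papers list below shows a standard VGAE-style decoder as one instance.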
Related papers
- Advancing Graph Generation through Beta Diffusion [49.49740940068255]
Graph Beta Diffusion (GBD) is a generative model specifically designed to handle the diverse nature of graph data.
We propose a modulation technique that enhances the realism of generated graphs by stabilizing critical graph topology.
arXiv Detail & Related papers (2024-06-13T17:42:57Z)
- Almost Surely Asymptotically Constant Graph Neural Networks [7.339728196535312]
We show that the output converges to a constant function, which upper-bounds what these classifiers can uniformly express.
This strong convergence phenomenon applies to a very wide class of GNNs, including state of the art models.
We empirically validate these findings, observing that the convergence phenomenon appears not only on random graphs but also on some real-world graphs.
arXiv Detail & Related papers (2024-03-06T17:40:26Z)
- On the Role of Edge Dependency in Graph Generative Models [28.203109773986167]
We introduce a novel evaluation framework for generative models of graphs.
We focus on the importance of model-generated graph overlap to ensure both accuracy and edge-diversity.
Our results indicate that our simple, interpretable models provide competitive baselines to popular generative models.
arXiv Detail & Related papers (2023-12-06T18:54:27Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it complements the edge features conditionally on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- Model Inversion Attacks against Graph Neural Networks [65.35955643325038]
We study model inversion attacks against Graph Neural Networks (GNNs).
In this paper, we present GraphMI to infer the private training graph data.
Our experimental results show that the evaluated defenses are not sufficiently effective and call for more advanced defenses against privacy attacks.
arXiv Detail & Related papers (2022-09-16T09:13:43Z)
- Node Copying: A Random Graph Model for Effective Graph Sampling [35.957719744856696]
We introduce the node copying model for constructing a distribution over graphs.
We show the usefulness of the copying model in three tasks.
We employ our proposed model to mitigate the effect of adversarial attacks on the graph topology.
arXiv Detail & Related papers (2022-08-04T04:04:49Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- An Interpretable Graph Generative Model with Heterophily [38.59200985962146]
We propose the first edge-independent graph generative model that is expressive enough to capture heterophily.
Our experiments demonstrate the effectiveness of our model for a variety of important application tasks.
arXiv Detail & Related papers (2021-11-04T17:34:39Z)
- A Deep Latent Space Model for Graph Representation Learning [10.914558012458425]
We propose a Deep Latent Space Model (DLSM) for directed graphs to incorporate the traditional latent variable based generative model into deep learning frameworks.
Our proposed model consists of a graph convolutional network (GCN) encoder and a decoder, which are layer-wise connected by a hierarchical variational auto-encoder architecture.
Experiments on real-world datasets show that the proposed model achieves state-of-the-art performance on both link prediction and community detection tasks.
arXiv Detail & Related papers (2021-06-22T12:41:19Z)
- Auto-decoding Graphs [91.3755431537592]
The generative model is an auto-decoder that learns to synthesize graphs from latent codes.
Graphs are synthesized using self-attention modules that are trained to identify likely connectivity patterns.
arXiv Detail & Related papers (2020-06-04T14:23:01Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
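The main abstract above counts variational graph autoencoders among edge independent models, and several related entries (e.g. the deep latent space model and the heterophily model) use a similar encoder/decoder pattern. The sketch below follows the standard VGAE recipe (GCN encoder, sigmoid inner-product decoder) rather than any specific model listed above, and shows why such decoders are edge independent: they emit a full probability matrix P whose entries are sampled or thresholded separately, so the sampler and expected-triangle formula from the first sketch apply unchanged. The weight matrices and helper names are placeholders for illustration; variational noise and training are omitted.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalize_adjacency(A):
    """Symmetrically normalized adjacency with self-loops, as in standard GCNs."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    return A_tilde / np.sqrt(np.outer(d, d))

def vgae_style_edge_probs(A, X, W1, W_mu):
    """Forward pass of a minimal VGAE-style generator (illustrative placeholder
    weights W1, W_mu; the variational noise term is omitted for brevity)."""
    A_hat = normalize_adjacency(A)
    H = np.maximum(A_hat @ X @ W1, 0.0)   # GCN layer with ReLU
    Z = A_hat @ H @ W_mu                  # latent node embeddings (the "mean" head)
    P = sigmoid(Z @ Z.T)                  # decoder: one probability per node pair
    np.fill_diagonal(P, 0.0)              # no self-loops
    return P                              # feed into sample_edge_independent(P)
```

Because every entry P[i, j] depends only on the embeddings of the pair and edges are then drawn separately, the resulting generator is edge independent regardless of how expressive the encoder is, which is exactly the class covered by the paper's triangle-density bound.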