Subgraph Frequency Distribution Estimation using Graph Neural Networks
- URL: http://arxiv.org/abs/2207.06684v1
- Date: Thu, 14 Jul 2022 06:23:38 GMT
- Title: Subgraph Frequency Distribution Estimation using Graph Neural Networks
- Authors: Zhongren Chen, Xinyue Xu, Shengyi Jiang, Hao Wang and Lu Mi
- Abstract summary: We propose GNNS, a novel representational learning framework that utilizes graph neural networks to sample subgraphs efficiently for estimating their frequency distribution.
Our framework includes an inference model and a generative model that learns hierarchical embeddings of nodes, subgraphs, and graph types.
With the learned model and embeddings, subgraphs are sampled in a highly scalable and parallel way, and the frequency distribution estimation is then performed based on these sampled subgraphs.
- Score: 17.02487540304784
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Small subgraphs (graphlets) are important features to describe fundamental
units of a large network. The calculation of subgraph frequency
distributions has wide application in multiple domains, including biology and
engineering. Unfortunately, due to the inherent complexity of this task, most
existing methods are computationally intensive and inefficient. In this
work, we propose GNNS, a novel representational learning framework that
utilizes graph neural networks to sample subgraphs efficiently for estimating
their frequency distribution. Our framework includes an inference model and a
generative model that learns hierarchical embeddings of nodes, subgraphs, and
graph types. With the learned model and embeddings, subgraphs are sampled in a
highly scalable and parallel way, and the frequency distribution estimation is
then performed based on these sampled subgraphs. Our method achieves accuracy
comparable to existing methods while running three orders of magnitude faster.
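To make the estimation step concrete, below is a minimal, hypothetical Python sketch: it classifies sampled 3-node induced subgraphs by edge count and normalizes the counts into a frequency distribution. Plain uniform sampling stands in for the learned GNN sampler, and all names are illustrative rather than taken from the paper.

```python
# Hypothetical sketch of subgraph frequency distribution estimation by
# sampling. GNNS replaces the uniform sampler below with a learned,
# embedding-guided GNN sampler; the estimation step is the same in spirit.
import itertools
import random
from collections import Counter

def graphlet_type(adj, triple):
    """Classify a 3-node induced subgraph by its number of edges
    (0 = empty, 1 = single edge, 2 = path, 3 = triangle)."""
    return sum(1 for a, b in itertools.combinations(triple, 2) if b in adj[a])

def estimate_distribution(adj, num_samples=10_000, seed=0):
    """Estimate the 3-node graphlet frequency distribution from samples."""
    rng = random.Random(seed)
    nodes = list(adj)
    counts = Counter(graphlet_type(adj, rng.sample(nodes, 3))
                     for _ in range(num_samples))
    total = sum(counts.values())
    return {k: c / total for k, c in sorted(counts.items())}

# Toy graph: a 4-cycle with one chord (adjacency stored as sets).
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
print(estimate_distribution(adj))
```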
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
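As a rough illustration of that substitution (not the authors' code), the sketch below builds a histogram of scalar node features over each node's neighborhood and scores it against a set of reference histograms with the histogram intersection kernel; in GNN-LoFI the reference histograms would be learned.

```python
# Hedged sketch: node update via histogram intersection instead of message
# passing. The anchors stand in for learned reference histograms.
import numpy as np

def local_histograms(x, neighbors, bins=8, lo=0.0, hi=1.0):
    """Normalized histogram of scalar node features over each neighborhood."""
    hists = np.zeros((len(neighbors), bins))
    for i, nbrs in enumerate(neighbors):
        h, _ = np.histogram(x[list(nbrs)], bins=bins, range=(lo, hi))
        hists[i] = h / max(h.sum(), 1)
    return hists

def histogram_intersection(hists, anchors):
    """score[i, k] = sum_b min(hists[i, b], anchors[k, b])."""
    return np.minimum(hists[:, None, :], anchors[None, :, :]).sum(axis=-1)

x = np.random.rand(5)                               # one scalar feature per node
neighbors = [{1, 2}, {0, 2}, {0, 1, 3}, {2, 4}, {3}]
anchors = np.random.dirichlet(np.ones(8), size=4)   # 4 reference histograms
print(histogram_intersection(local_histograms(x, neighbors), anchors).shape)  # (5, 4)
```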
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Leveraging Graph Diffusion Models for Network Refinement Tasks [72.54590628084178]
We propose a novel graph generative framework, SGDM, based on subgraph diffusion.
Our framework not only improves the scalability and fidelity of graph diffusion models, but also leverages the reverse process to perform novel, conditional generation tasks.
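For intuition only, the sketch below shows a DDPM-style forward noising step and one reverse denoising step on a dense adjacency matrix; SGDM's actual subgraph-based formulation differs, and `denoiser` is a placeholder for a trained model.

```python
# Illustrative single diffusion step on an adjacency matrix (assumption-laden;
# not SGDM's exact formulation).
import numpy as np

def forward_noise(a, t, betas, rng):
    """q(a_t | a_0): corrupt a_0 with Gaussian noise at timestep t."""
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    return np.sqrt(alpha_bar) * a + np.sqrt(1 - alpha_bar) * rng.standard_normal(a.shape)

def reverse_step(a_t, t, betas, denoiser, rng):
    """One ancestral sampling step of p(a_{t-1} | a_t), given a noise predictor."""
    beta = betas[t]
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    eps_hat = denoiser(a_t, t)  # placeholder for a trained graph denoiser
    mean = (a_t - beta / np.sqrt(1 - alpha_bar) * eps_hat) / np.sqrt(1.0 - beta)
    return mean if t == 0 else mean + np.sqrt(beta) * rng.standard_normal(a_t.shape)

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 10)
a0 = (rng.random((4, 4)) < 0.5).astype(float)       # toy dense adjacency
a_prev = reverse_step(forward_noise(a0, 9, betas, rng), 9, betas,
                      denoiser=lambda a, t: np.zeros_like(a), rng=rng)
```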
arXiv Detail & Related papers (2023-11-29T18:02:29Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
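As a concrete anchor for this geometry, the sketch below evaluates the closed-form Bures-Wasserstein distance between two PSD matrices (standing in for covariances of smooth graph signal distributions); the mean graph arises as a barycenter under this metric. The setup is illustrative, not the authors' code.

```python
# Minimal Bures-Wasserstein distance between PSD matrices (illustrative).
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(a, b):
    """BW(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})."""
    root_a = sqrtm(a)
    cross = sqrtm(root_a @ b @ root_a)
    return float((np.trace(a) + np.trace(b) - 2.0 * np.trace(cross)).real)

rng = np.random.default_rng(0)
m, n = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
a, b = m @ m.T + np.eye(4), n @ n.T + np.eye(4)   # toy graph-derived covariances
print(bures_wasserstein(a, b))
```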
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- Convolutional Learning on Multigraphs [153.20329791008095]
We develop convolutional information processing on multigraphs and introduce convolutional multigraph neural networks (MGNNs).
To capture the complex dynamics of information diffusion within and across each of the multigraph's classes of edges, we formalize a convolutional signal processing model.
We develop a multigraph learning architecture, including a sampling procedure to reduce computational complexity.
The introduced architecture is applied to optimal wireless resource allocation and a hate speech localization task, offering improved performance over traditional graph neural networks.
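A hedged sketch of the generic pattern behind such layers: one polynomial graph filter per edge class (shift operator), with the per-class outputs summed. This captures the convolutional signal processing model in spirit; the exact MGNN parameterization may differ.

```python
# Illustrative multigraph convolution: y = sum_e sum_k c[e][k] * S_e^k x.
import numpy as np

def multigraph_conv(x, shifts, coeffs):
    """Apply one polynomial filter per shift operator and sum the outputs."""
    y = np.zeros_like(x)
    for s, c in zip(shifts, coeffs):
        z = x.copy()                 # S^0 x
        for k, c_k in enumerate(c):
            if k > 0:
                z = s @ z            # S^k x
            y += c_k * z
    return y

a1 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
a2 = np.eye(4)[::-1]                 # a second edge class (toy relation)
x = np.arange(4.0).reshape(4, 1)     # one input feature channel
print(multigraph_conv(x, [a1, a2], [[0.5, 0.3, 0.1], [1.0, 0.2, 0.0]]))
```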
arXiv Detail & Related papers (2022-09-23T00:33:04Z)
- Ordered Subgraph Aggregation Networks [19.18478955240166]
Subgraph-enhanced graph neural networks (GNNs) have emerged, provably boosting the expressive power of standard (message-passing) GNNs.
Here, we introduce a theoretical framework and extend the known expressivity results of subgraph-enhanced GNNs.
We show that increasing subgraph size always increases the expressive power and develop a better understanding of their limitations.
arXiv Detail & Related papers (2022-06-22T15:19:34Z)
- Bayesian graph convolutional neural networks via tempered MCMC [0.41998444721319217]
Deep learning models, such as convolutional neural networks, have long been applied to image and multi-media tasks.
More recently, attention has turned to unstructured data that can be represented via graphs.
These types of data are often found in health and medicine, social networks, and research data repositories.
arXiv Detail & Related papers (2021-04-17T04:03:25Z)
- Subgraph Neural Networks [14.222887950206662]
We introduce SubGNN, a subgraph neural network to learn disentangled subgraph representations.
SubGNN performs exceptionally well on challenging biomedical datasets.
arXiv Detail & Related papers (2020-06-18T13:54:30Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
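For a concrete reference point, here is a minimal sum-product (forward-backward) routine on a chain-structured factor graph; in the learned variant described above, the factor table would come from a neural network rather than being specified by hand. All names are illustrative.

```python
# Sum-product on a chain factor graph with a shared (stationary) factor table.
import numpy as np

def sum_product_chain(factors, prior):
    """Marginals for a chain: factors[t][i, j] is prop. to f(x_t = i, x_{t+1} = j)."""
    T, S = len(factors) + 1, prior.shape[0]
    fwd, bwd = np.zeros((T, S)), np.ones((T, S))
    fwd[0] = prior
    for t in range(T - 1):                   # forward messages
        fwd[t + 1] = fwd[t] @ factors[t]
        fwd[t + 1] /= fwd[t + 1].sum()
    for t in range(T - 2, -1, -1):           # backward messages
        bwd[t] = factors[t] @ bwd[t + 1]
        bwd[t] /= bwd[t].sum()
    marg = fwd * bwd
    return marg / marg.sum(axis=1, keepdims=True)

f = np.array([[0.8, 0.2], [0.3, 0.7]])       # stand-in for a learned factor
print(sum_product_chain([f] * 4, prior=np.array([0.5, 0.5])))
```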
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Ripple Walk Training: A Subgraph-based training framework for Large and Deep Graph Neural Network [10.36962234388739]
We propose a general subgraph-based training framework, namely Ripple Walk Training (RWT), for deep and large graph neural networks.
RWT samples subgraphs from the full graph to constitute a mini-batch, and the full GNN is updated based on the mini-batch gradient.
Extensive experiments on different sizes of graphs demonstrate the effectiveness and efficiency of RWT in training various GNNs.
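The sampling step admits a short sketch, under a simplified reading of RWT: grow a subgraph by admitting a fixed ratio `r` of the current neighborhood frontier until a target size is reached, then treat each sampled subgraph as one mini-batch.

```python
# Simplified ripple-walk-style subgraph sampler (assumption-laden sketch).
import random

def ripple_walk_subgraph(adj, target_size, r=0.5, rng=random.Random(0)):
    """Sample one subgraph (as a node set) by ripple-like frontier expansion."""
    target_size = min(target_size, len(adj))
    sub = {rng.choice(list(adj))}
    while len(sub) < target_size:
        frontier = {v for u in sub for v in adj[u]} - sub
        if not frontier:                     # disconnected graph: jump elsewhere
            frontier = set(adj) - sub
        take = max(1, int(r * len(frontier)))
        sub |= set(rng.sample(sorted(frontier), min(take, target_size - len(sub))))
    return sub

adj = {i: {(i - 1) % 8, (i + 1) % 8} for i in range(8)}    # toy ring graph
batch = [ripple_walk_subgraph(adj, target_size=4) for _ in range(3)]
print(batch)   # each subgraph would drive one mini-batch gradient step
```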
arXiv Detail & Related papers (2020-02-17T19:07:41Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
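To illustrate what dyadic independence buys: once every pair (i, j) has its own edge probability p_ij, graphs can be sampled edge by edge. The logistic block parameterization below is a common choice and an assumption here, not necessarily the paper's exact form.

```python
# Edge-independent graph model with a 2-block logistic parameterization
# (illustrative stand-in for an approximated ERG).
import numpy as np

def edge_probabilities(theta_block, blocks):
    """p_ij = sigmoid(theta[b_i, b_j]) for nodes in blocks b_i and b_j."""
    logits = theta_block[np.ix_(blocks, blocks)]
    return 1.0 / (1.0 + np.exp(-logits))

def sample_graph(p, rng):
    """Sample an undirected simple graph with independent edges."""
    upper = np.triu(rng.random(p.shape) < p, k=1)
    return (upper | upper.T).astype(int)

theta = np.array([[1.5, -2.0], [-2.0, 0.5]])   # block-level parameters
blocks = np.array([0, 0, 1, 1, 1])             # node-to-block assignment
p = edge_probabilities(theta, blocks)
print(sample_graph(p, np.random.default_rng(0)))
```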
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.