Generalized Graphon Process: Convergence of Graph Frequencies in
Stretched Cut Distance
- URL: http://arxiv.org/abs/2309.05260v1
- Date: Mon, 11 Sep 2023 06:34:46 GMT
- Title: Generalized Graphon Process: Convergence of Graph Frequencies in
Stretched Cut Distance
- Authors: Xingchao Jian, Feng Ji, Wee Peng Tay
- Abstract summary: Sparse graph sequences converge to the trivial graphon under the conventional definition of cut distance.
We utilize the concepts of generalized graphons and stretched cut distance to describe the convergence of sparse graph sequences.
Our results indicate the possibility of transfer learning between sparse graphs.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphons have traditionally served as limit objects for dense graph
sequences, with the cut distance serving as the metric for convergence.
However, sparse graph sequences converge to the trivial graphon under the
conventional definition of cut distance, which makes this framework inadequate
for many practical applications. In this paper, we utilize the concepts of
generalized graphons and stretched cut distance to describe the convergence of
sparse graph sequences. Specifically, we consider a random graph process
generated from a generalized graphon. This random graph process converges to
the generalized graphon in stretched cut distance. We use this random graph
process to model the growing sparse graph, and prove the convergence of the
adjacency matrices' eigenvalues. We supplement our findings with experimental
validation. Our results indicate the possibility of transfer learning between
sparse graphs.
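The abstract describes a random graph process sampled from a graphon, whose scaled adjacency eigenvalues converge. The following is a minimal sketch of this idea, not the paper's code: it samples a dense graph from the illustrative rank-one graphon W(x, y) = xy (an assumption for the example, not the paper's generalized graphon, and without the sparse/stretched-cut-distance machinery) and shows that the top eigenvalue of A/n approaches the graphon operator's eigenvalue 1/3.

```python
import numpy as np

rng = np.random.default_rng(0)

def W(x, y):
    # Rank-one graphon: its integral operator T_W f(x) = \int W(x,y) f(y) dy
    # has a single nonzero eigenvalue 1/3, with eigenfunction f(x) = x.
    return x * y

def sample_graph(n):
    # Sample latent positions u_i ~ Uniform[0,1], then include each edge
    # {i, j} independently with probability W(u_i, u_j).
    u = rng.uniform(size=n)
    p = W(u[:, None], u[None, :])
    upper = np.triu(rng.uniform(size=(n, n)) < p, k=1)
    A = upper + upper.T  # symmetric adjacency matrix, zero diagonal
    return A.astype(float)

for n in (100, 400, 1600):
    A = sample_graph(n)
    # Eigenvalues of A/n approximate the spectrum of the graphon operator;
    # the largest one should drift toward 1/3 as n grows.
    top = np.linalg.eigvalsh(A / n)[-1]
    print(n, round(top, 3))
```

This dense toy illustrates only the spectral-convergence phenomenon; the paper's contribution is precisely the extension of such convergence to sparse graph processes, where the normalization and the metric (stretched cut distance) must be adapted.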
Related papers
- Generalization Bounds for Message Passing Networks on Mixture of Graphons [21.457225542391267]
We study the generalization capabilities of Message Passing Neural Networks (MPNNs)
We derive generalization bounds specifically for MPNNs with normalized sum aggregation and mean aggregation.
Our results imply that MPNNs with higher complexity than the size of the training set can still generalize effectively.
arXiv Detail & Related papers (2024-04-04T14:26:47Z)
- Sampling and Uniqueness Sets in Graphon Signal Processing [136.68956350251418]
We study the properties of sampling sets on families of large graphs by leveraging the theory of graphons and graph limits.
We exploit the convergence results to provide an algorithm that obtains approximately close to optimal sampling sets.
arXiv Detail & Related papers (2024-01-11T22:31:48Z)
- A Poincaré Inequality and Consistency Results for Signal Sampling on Large Graphs [34.99659089854587]
We introduce a signal sampling theory for a type of graph limit -- the graphon.
We show that unique sampling sets on a convergent graph sequence converge to unique sampling sets on the graphon.
We then propose a related graphon signal sampling algorithm for large graphs, and demonstrate its good empirical performance on graph machine learning tasks.
arXiv Detail & Related papers (2023-11-17T16:04:31Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]^2 in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z)
- Demystifying Graph Convolution with a Simple Concatenation [6.542119695695405]
We quantify the information overlap between graph topology, node features, and labels.
We show that graph concatenation is a simple but more flexible alternative to graph convolution.
arXiv Detail & Related papers (2022-07-18T16:39:33Z)
- G-Mixup: Graph Data Augmentation for Graph Classification [55.63157775049443]
Mixup has shown superiority in improving the generalization and robustness of neural networks by interpolating features and labels between two random samples.
We propose $\mathcal{G}$-Mixup to augment graphs for graph classification by interpolating the generator (i.e., graphon) of different classes of graphs.
Experiments show that $\mathcal{G}$-Mixup substantially improves the generalization and robustness of GNNs.
arXiv Detail & Related papers (2022-02-15T04:09:44Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
Graphon is a nonparametric model that generates graphs with arbitrary sizes and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.