Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs
- URL: http://arxiv.org/abs/2212.08171v2
- Date: Wed, 23 Aug 2023 01:52:33 GMT
- Title: Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs
- Authors: Alejandro Parada-Mayorga and Zhiyang Wang and Alejandro Ribeiro
- Abstract summary: We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]² in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
- Score: 131.53471236405628
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we propose a pooling approach for convolutional information
processing on graphs relying on the theory of graphons and limits of dense
graph sequences. We present three methods that exploit the induced graphon
representation of graphs and graph signals on partitions of [0, 1]² in the
graphon space. As a result we derive low dimensional representations of the
convolutional operators, while a dimensionality reduction of the signals is
achieved by simple local interpolation of functions in L²([0, 1]). We prove
that those low dimensional representations constitute a convergent sequence of
graphs and graph signals, respectively. The methods proposed and the
theoretical guarantees that we provide show that the reduced graphs and signals
inherit spectral-structural properties of the original quantities. We evaluate
our approach with a set of numerical experiments performed on graph neural
networks (GNNs) that rely on graphon pooling. We observe that graphon pooling
performs significantly better than other approaches proposed in the literature
when dimensionality reduction ratios between layers are large. We also observe
that when graphon pooling is used we have, in general, less overfitting and
lower computational cost.
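The abstract's reduction step can be illustrated with a minimal sketch: assign each node an equal-length interval of [0, 1], then average the adjacency matrix over the resulting blocks of [0, 1]² to get a smaller convolutional operator, and locally average the signal over each interval. This is an illustrative block-averaging scheme assumed from the abstract, not the paper's exact construction; the function name `graphon_pool` is hypothetical.

```python
import numpy as np

def graphon_pool(A, x, m):
    """Illustrative graphon-style pooling (assumed from the abstract):
    view A as a step-function graphon on [0,1]^2, average over an m x m
    grid of blocks to get a reduced m-node operator, and pool the signal
    x by local averaging over each interval."""
    n = A.shape[0]
    # Each of the n nodes is assigned to one of m equal intervals of [0, 1].
    groups = np.array_split(np.arange(n), m)
    A_red = np.empty((m, m))
    x_red = np.empty(m)
    for i, gi in enumerate(groups):
        # Pooled signal value: local average over the i-th interval.
        x_red[i] = x[gi].mean()
        for j, gj in enumerate(groups):
            # Reduced operator entry: average edge density between blocks.
            A_red[i, j] = A[np.ix_(gi, gj)].mean()
    return A_red, x_red

# Usage: a 4-node graph with two 2-node communities, pooled to 2 nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 1.0, 3.0, 3.0])
A_red, x_red = graphon_pool(A, x, 2)
```

Because pooling here is a plain block average, the reduced operator keeps the community structure of the original graph (dense diagonal blocks stay dense), which is the kind of spectral-structural inheritance the abstract's guarantees concern.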
Related papers
- Generalized Graphon Process: Convergence of Graph Frequencies in Stretched Cut Distance [30.279435887366578]
Sparse graph sequences converge to the trivial graphon under the conventional definition of cut distance.
We utilize the concepts of generalized graphons and stretched cut distance to describe the convergence of sparse graph sequences.
Our results indicate the possibility of transfer learning between sparse graphs.
arXiv Detail & Related papers (2023-09-11T06:34:46Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
Graphon is a nonparametric model that generates graphs with arbitrary sizes and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
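The claim above that a graphon "can be induced from graphs easily" refers to the standard step-function construction: each node owns an equal-length interval of [0, 1], and the induced graphon is piecewise constant on the resulting grid. A minimal sketch of this construction (the function name `induced_graphon` is hypothetical):

```python
import numpy as np

def induced_graphon(A):
    """Return W: [0,1]^2 -> [0,1], the step-function graphon induced by
    adjacency matrix A. Node i owns the interval [i/n, (i+1)/n), and W is
    constant on each resulting block of [0,1]^2."""
    n = A.shape[0]
    def W(u, v):
        i = min(int(n * u), n - 1)  # index of the interval containing u
        j = min(int(n * v), n - 1)  # index of the interval containing v
        return A[i, j]
    return W

# Usage: the induced graphon of a single edge between two nodes.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
W = induced_graphon(A)
```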
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- Pyramidal Reservoir Graph Neural Network [18.632681846787246]
We propose a deep Graph Neural Network (GNN) model that alternates two types of layers.
We show how graph pooling can reduce the computational complexity of the model.
Our proposed approach to the design of RC-based GNNs offers an advantageous and principled trade-off between accuracy and complexity.
arXiv Detail & Related papers (2021-04-10T08:34:09Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Wasserstein-based Graph Alignment [56.84964475441094]
We cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.