Maximal Independent Sets for Pooling in Graph Neural Networks
- URL: http://arxiv.org/abs/2307.13011v1
- Date: Mon, 24 Jul 2023 13:47:30 GMT
- Title: Maximal Independent Sets for Pooling in Graph Neural Networks
- Authors: Stevan Stanovic (ENSICAEN, UNICAEN), Benoit Gaüzère (INSA Rouen Normandie, UNIROUEN, ULH, LITIS), Luc Brun (ENSICAEN, UNICAEN)
- Abstract summary: CNNs have enabled major advances in image classification through convolution and pooling.
Traditional graph pooling methods suffer from at least one of the following drawbacks: Graph disconnection or overconnection, low decimation ratio, and deletion of large parts of graphs.
We present three pooling methods based on the notion of maximal independent sets that avoid these pitfalls.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional Neural Networks (CNNs) have enabled major advances in image
classification through convolution and pooling. In particular, image pooling
transforms a connected discrete lattice into a reduced lattice with the same
connectivity and allows reduction functions to consider all pixels in an image.
However, there is no pooling that satisfies these properties for graphs. In
fact, traditional graph pooling methods suffer from at least one of the
following drawbacks: Graph disconnection or overconnection, low decimation
ratio, and deletion of large parts of graphs. In this paper, we present three
pooling methods based on the notion of maximal independent sets that avoid
these pitfalls. Our experimental results confirm the relevance of maximal
independent set constraints for graph pooling.
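The abstract gives no implementation details, so the following is only an illustrative sketch of the underlying idea, not the authors' method: a maximal independent set is selected greedily, every remaining vertex is attached to one surviving neighbour (which maximality guarantees to exist), and features are averaged per cluster. The graph representation, the attachment rule, and the mean reduction function are all assumptions made for the example.

```python
# Minimal sketch: greedy maximal independent set (MIS) pooling on an
# undirected graph given as an adjacency list. Illustration only; the
# paper's actual selection and reduction rules are not reproduced here.

def maximal_independent_set(adjacency):
    """Greedily pick vertices so that no two picked vertices are adjacent.

    `adjacency` maps each vertex to the set of its neighbours. The result is
    maximal: every non-selected vertex has at least one selected neighbour.
    """
    selected, blocked = set(), set()
    for v in adjacency:                      # any vertex order works; the paper
        if v in blocked:                     # may use a smarter priority
            continue
        selected.add(v)
        blocked.update(adjacency[v])         # neighbours can no longer be picked
    return selected

def pool_graph(adjacency, features):
    """Attach every vertex to a surviving neighbour and average features."""
    survivors = maximal_independent_set(adjacency)
    # Each non-surviving vertex joins the cluster of one surviving neighbour
    # (guaranteed to exist because the set is maximal).
    cluster = {v: v if v in survivors else
               next(u for u in adjacency[v] if u in survivors)
               for v in adjacency}
    # Reduced features: mean of the features mapped to each survivor.
    dim = len(next(iter(features.values())))
    pooled = {s: [0.0] * dim for s in survivors}
    counts = {s: 0 for s in survivors}
    for v, s in cluster.items():
        pooled[s] = [a + b for a, b in zip(pooled[s], features[v])]
        counts[s] += 1
    pooled = {s: [x / counts[s] for x in f] for s, f in pooled.items()}
    # Reduced edges: survivors are connected if their clusters were adjacent
    # (both orientations kept for simplicity).
    edges = {(cluster[u], cluster[v]) for u in adjacency for v in adjacency[u]
             if cluster[u] != cluster[v]}
    return pooled, edges

if __name__ == "__main__":
    adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {3}}
    feats = {v: [float(v), 1.0] for v in adj}
    print(pool_graph(adj, feats))
```

In this sketch every cluster is a star around its survivor, so contracting clusters cannot disconnect a connected input and no vertex is simply dropped, which mirrors the properties the abstract emphasises.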
Related papers
- Curvature-based Pooling within Graph Neural Networks [3.1733862899654643]
Over-squashing and over-smoothing limit the capabilities of graph neural networks (GNNs).
CurvPool exploits the notion of curvature of a graph to adaptively identify structures responsible for both over-smoothing and over-squashing.
We compare it to other state-of-the-art pooling approaches and establish its competitiveness in terms of classification accuracy, computational complexity, and flexibility.
arXiv Detail & Related papers (2023-08-31T08:00:08Z) - Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of $[0, 1]^2$ in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z) - Graph Pooling with Maximum-Weight $k$-Independent Sets [12.251091325930837]
We introduce a graph coarsening mechanism based on the graph-theoretic concept of maximum-weight $k$-independent sets.
We prove theoretical guarantees for distortion bounds on path lengths, as well as the ability to preserve key topological properties in the coarsened graphs.
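For context, reading "$k$-independent" as a distance-based generalisation of an independent set (vertices pairwise more than $k$ hops apart), the maximum-weight version is NP-hard, so coarsening schemes typically rely on heuristics. The sketch below shows one common greedy heuristic (heaviest vertex first, then block its $k$-hop ball); it is an assumed illustration, not the selection rule of the cited paper.

```python
# Hypothetical greedy heuristic for a weighted k-independent set: vertices are
# taken in decreasing weight order and every vertex within k hops of a chosen
# vertex is blocked. Exact maximum-weight selection is NP-hard.
from collections import deque

def k_hop_neighbourhood(adjacency, source, k):
    """All vertices at distance <= k from `source` (plain BFS)."""
    seen, frontier = {source}, deque([(source, 0)])
    while frontier:
        v, d = frontier.popleft()
        if d == k:
            continue
        for u in adjacency[v]:
            if u not in seen:
                seen.add(u)
                frontier.append((u, d + 1))
    return seen

def greedy_weighted_k_independent_set(adjacency, weights, k):
    """Greedy maximal k-independent set, heaviest vertices first."""
    chosen, blocked = set(), set()
    for v in sorted(adjacency, key=lambda x: -weights[x]):
        if v in blocked:
            continue
        chosen.add(v)
        blocked |= k_hop_neighbourhood(adjacency, v, k)
    return chosen

if __name__ == "__main__":
    adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}   # a path graph
    w = {0: 0.9, 1: 0.2, 2: 0.8, 3: 0.1, 4: 0.7}
    print(greedy_weighted_k_independent_set(adj, w, k=2))     # -> {0, 4}
```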
arXiv Detail & Related papers (2022-08-06T14:12:47Z) - Maximal Independent Vertex Set applied to Graph Pooling [0.0]
Convolutional neural networks (CNN) have enabled major advances in image classification through convolution and pooling.
In particular, image pooling transforms a connected discrete grid into a reduced grid with the same connectivity and allows reduction functions to take into account all the pixels of an image.
However, graph pooling methods may disconnect the reduced graph or delete a large part of it. We propose to overcome both problems using a new pooling method, named MIVSPool.
arXiv Detail & Related papers (2022-08-02T14:42:58Z) - Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z) - Maximum Entropy Weighted Independent Set Pooling for Graph Neural Networks [7.398195748292981]
We propose a novel pooling layer for graph neural networks based on maximizing the mutual information between the pooled graph and the input graph.
We show that the input graph to the pooling layer can be viewed as a representation of a noisy communication channel.
We show that reaching the maximum mutual information is equivalent to finding a maximum weight independent set of the graph.
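Since the summary reduces the pooling objective to a maximum-weight independent set (MWIS), a tiny brute-force MWIS solver makes the target object concrete. The graph, weights, and exhaustive search below are purely illustrative: the problem is NP-hard, so practical pooling layers would use relaxations or heuristics rather than enumeration.

```python
# Illustration only: exact maximum-weight independent set (MWIS) by brute
# force on a tiny graph. Feasible only for very small vertex counts.
from itertools import combinations

def max_weight_independent_set(edges, weights):
    """Exhaustively search all vertex subsets and keep the heaviest
    subset that contains no edge endpoints together."""
    vertices = list(weights)
    best, best_w = set(), 0.0
    for r in range(len(vertices) + 1):
        for subset in combinations(vertices, r):
            s = set(subset)
            independent = all(not (u in s and v in s) for u, v in edges)
            w = sum(weights[v] for v in s)
            if independent and w > best_w:
                best, best_w = s, w
    return best, best_w

if __name__ == "__main__":
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]           # a 4-cycle
    weights = {0: 1.0, 1: 2.0, 2: 1.5, 3: 1.0}
    print(max_weight_independent_set(edges, weights))   # ({1, 3}, 3.0)
```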
arXiv Detail & Related papers (2021-07-03T11:19:28Z) - A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z) - Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z) - Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z) - Second-Order Pooling for Graph Neural Networks [62.13156203025818]
We propose to use second-order pooling as graph pooling, which naturally solves the above challenges.
We show that direct use of second-order pooling with graph neural networks leads to practical problems.
We propose two novel global graph pooling methods based on second-order pooling; namely, bilinear mapping and attentional second-order pooling.
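As a rough illustration of what "second-order pooling" and "bilinear mapping" compute, the sketch below forms the flattened second-order statistic $H^\top H$ of a node-feature matrix and a projected variant $(HW)^\top (HW)$. The NumPy arrays and the random projection stand in for learned GNN features and a learned $W$; the exact formulation in the cited paper may differ.

```python
# Sketch of plain second-order pooling and a bilinear-mapping variant, using
# NumPy in place of a trainable GNN layer. Shapes and the projection W are
# illustrative assumptions; in a real model W would be learned.
import numpy as np

def second_order_pool(h):
    """h: (n, d) node features -> flattened (d*d,) graph representation."""
    return (h.T @ h).reshape(-1)             # covariance-like second-order statistic

def bilinear_second_order_pool(h, w):
    """Project features with w: (d, d') before the second-order product,
    shrinking the output from d*d to d'*d' entries."""
    z = h @ w                                 # (n, d')
    return (z.T @ z).reshape(-1)              # (d'*d',)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h = rng.standard_normal((5, 8))           # 5 nodes, 8-dim features
    w = rng.standard_normal((8, 4))           # hypothetical projection
    print(second_order_pool(h).shape)               # (64,)
    print(bilinear_second_order_pool(h, w).shape)   # (16,)
```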
arXiv Detail & Related papers (2020-07-20T20:52:36Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.