On the Effectiveness of Hybrid Pooling in Mixup-Based Graph Learning for Language Processing
- URL: http://arxiv.org/abs/2210.03123v3
- Date: Wed, 22 May 2024 02:00:28 GMT
- Title: On the Effectiveness of Hybrid Pooling in Mixup-Based Graph Learning for Language Processing
- Authors: Zeming Dong, Qiang Hu, Zhenya Zhang, Yuejun Guo, Maxime Cordy, Mike Papadakis, Yves Le Traon, Jianjun Zhao
- Abstract summary: Graph neural network (GNN)-based graph learning has been popular in natural language and programming language processing.
Recently, to enhance GNNs in graph learning tasks, Manifold-Mixup has been widely adopted.
This paper explores how graph pooling operators affect the performance of Mixup-based graph learning.
- Score: 20.812886172494082
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural network (GNN)-based graph learning has become popular in natural language and programming language processing, particularly for text and source code classification. Typically, GNNs are built from alternating layers that learn transformations of graph node features and graph pooling layers that use graph pooling operators (e.g., Max-pooling) to reduce the number of nodes while preserving the semantic information of the graph. Recently, Manifold-Mixup, a data augmentation technique that produces synthetic graph data by linearly mixing a pair of graph data and their labels, has been widely adopted to enhance GNNs in graph learning tasks. However, the performance of Manifold-Mixup can be strongly affected by the choice of graph pooling operator, and few studies have been dedicated to uncovering this effect. To bridge this gap, we take an early step toward exploring how graph pooling operators affect the performance of Mixup-based graph learning. To that end, we conduct a comprehensive empirical study that applies Manifold-Mixup to a formal characterization of graph pooling covering 11 graph pooling operators (9 hybrid and 2 non-hybrid). The experimental results on both natural language datasets (Gossipcop, Politifact) and programming language datasets (JAVA250, Python800) demonstrate that hybrid pooling operators are more effective for Manifold-Mixup than standard Max-pooling and the state-of-the-art Graph Multiset Transformer (GMT) pooling, producing more accurate and robust GNN models.
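To make the two ingredients in the abstract concrete, the following is a minimal, hypothetical sketch (plain PyTorch, not the authors' code) of Manifold-Mixup applied to graph-level representations. The hybrid pooling operator here simply concatenates max- and mean-pooled node features; the paper's nine hybrid operators and the GMT baseline are not reproduced.

```python
# Minimal sketch, not the authors' implementation: Manifold-Mixup applied to
# graph-level representations produced by a hypothetical hybrid pooling
# operator that concatenates max- and mean-pooling of node features.
import torch

def hybrid_pool(node_feats: torch.Tensor) -> torch.Tensor:
    """Pool a [num_nodes, dim] node-feature matrix into one graph-level vector
    by concatenating max-pooling and mean-pooling over the node dimension."""
    return torch.cat([node_feats.max(dim=0).values, node_feats.mean(dim=0)], dim=-1)

def manifold_mixup(h_a, y_a, h_b, y_b, alpha: float = 1.0):
    """Linearly mix two graph-level representations and their one-hot labels
    with a Beta(alpha, alpha)-distributed coefficient, as in Manifold-Mixup."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    return lam * h_a + (1.0 - lam) * h_b, lam * y_a + (1.0 - lam) * y_b

# Toy usage: two graphs with different node counts but the same feature dimension.
g1, y1 = torch.randn(5, 16), torch.tensor([1.0, 0.0])  # graph 1: 5 nodes, class 0
g2, y2 = torch.randn(8, 16), torch.tensor([0.0, 1.0])  # graph 2: 8 nodes, class 1
h_mix, y_mix = manifold_mixup(hybrid_pool(g1), y1, hybrid_pool(g2), y2)
print(h_mix.shape, y_mix)  # torch.Size([32]) and a soft label between the two classes
```

In the study itself, the mixing is performed on hidden graph representations inside the GNN and the mixed pairs are used to train the classifier; what the paper varies is the pooling operator that produces those representations.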
Related papers
- Diffusion on Graph: Augmentation of Graph Structure for Node Classification [7.9233221247736205]
We propose Diffusion on Graph (DoG), which generates synthetic graph structures to boost the performance of graph neural networks (GNNs).
The synthetic graph structures generated by DoG are combined with the original graph to form an augmented graph for the training of node-level learning tasks.
To mitigate the adverse effect of the noise introduced by the synthetic graph structures, a low-rank regularization method is proposed.
arXiv Detail & Related papers (2025-03-16T16:39:25Z) - SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
arXiv Detail & Related papers (2024-04-21T13:11:59Z) - Co-attention Graph Pooling for Efficient Pairwise Graph Interaction Learning [19.58671020943416]
Graph Neural Networks (GNNs) have proven to be effective in processing and learning from graph-structured data.
We propose a novel and efficient graph-level approach for extracting interaction representations using co-attention in graph pooling.
Our method, Co-Attention Graph Pooling (CAGPool), exhibits competitive performance relative to existing methods in both classification and regression tasks.
arXiv Detail & Related papers (2023-07-28T07:53:34Z) - Graph Mixup with Soft Alignments [49.61520432554505]
We study graph data augmentation by mixup, which has been used successfully on images.
We propose S-Mixup, a simple yet effective mixup method for graph classification by soft alignments.
arXiv Detail & Related papers (2023-06-11T22:04:28Z) - Graph Mixture of Experts: Learning on Large-Scale Graphs with Explicit Diversity Modeling [60.0185734837814]
Graph neural networks (GNNs) have found extensive applications in learning from graph data.
To bolster the generalization capacity of GNNs, it has become customary to augment training graph structures with techniques like graph augmentations.
This study introduces the concept of Mixture-of-Experts (MoE) to GNNs, with the aim of augmenting their capacity to adapt to a diverse range of training graph structures.
arXiv Detail & Related papers (2023-04-06T01:09:36Z) - Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z) - Graph Pooling via Coarsened Graph Infomax [9.045707667111873]
We propose Coarsened GraphPool Infomaxing (CGI) to maximize the mutual information between the input and the coarsened graph of each pooling layer.
To achieve mutual information neural maximization, we apply contrastive learning and propose a self-attention-based algorithm for learning positive and negative samples.
arXiv Detail & Related papers (2021-05-04T03:50:21Z) - Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z) - Second-Order Pooling for Graph Neural Networks [62.13156203025818]
We propose to use second-order pooling as graph pooling, which naturally solves the above challenges.
We show that direct use of second-order pooling with graph neural networks leads to practical problems.
We propose two novel global graph pooling methods based on second-order pooling; namely, bilinear mapping and attentional second-order pooling.
arXiv Detail & Related papers (2020-07-20T20:52:36Z) - Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - CoSimGNN: Towards Large-scale Graph Similarity Computation [5.17905821006887]
Graph Neural Networks (GNNs) provide a data-driven solution for this task.
Existing GNN-based methods, which either embed the two graphs separately or deploy cross-graph interactions over whole graph pairs, still fail to achieve competitive results.
We propose the "embedding-coarsening-matching" framework CoSimGNN, which first embeds and coarsens large graphs with adaptive pooling operation and then deploys fine-grained interactions on the coarsened graphs for final similarity scores.
arXiv Detail & Related papers (2020-05-14T16:33:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.