Graph Pointer Neural Networks
- URL: http://arxiv.org/abs/2110.00973v1
- Date: Sun, 3 Oct 2021 10:18:25 GMT
- Title: Graph Pointer Neural Networks
- Authors: Tianmeng Yang, Yujing Wang, Zhihan Yue, Yaming Yang, Yunhai Tong, Jing
Bai
- Abstract summary: We present Graph Pointer Neural Networks (GPNN) to tackle the challenges of learning on heterophilic graphs.
We leverage a pointer network to select the most relevant nodes from a large number of multi-hop neighbors.
GPNN significantly improves classification performance over state-of-the-art methods.
- Score: 11.656981519694218
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have shown advantages in various graph-based
applications. Most existing GNNs assume strong homophily of graph structure and
apply permutation-invariant local aggregation of neighbors to learn a
representation for each node. However, they fail to generalize to heterophilic
graphs, where most neighboring nodes have different labels or features, and the
relevant nodes are distant. Few recent studies attempt to address this problem
by combining multiple hops of hidden representations of central nodes (i.e.,
multi-hop-based approaches) or sorting the neighboring nodes based on attention
scores (i.e., ranking-based approaches). However, these approaches have
apparent limitations. On the one hand, multi-hop-based approaches do not
explicitly distinguish relevant nodes from the large number of multi-hop
neighbors, leading to a severe over-smoothing problem. On the other hand,
ranking-based models do not jointly optimize node ranking with the end task,
resulting in sub-optimal solutions. In this work, we present Graph Pointer Neural
Networks (GPNN) to tackle these challenges. We leverage a pointer
network to select the most relevant nodes from a large number of multi-hop
neighbors, constructing an ordered sequence according to each node's
relationship with the central node. A 1D convolution is then applied to extract
high-level features from the node sequence. The pointer-network-based ranker in
GPNN is jointly optimized with the other components in an end-to-end manner. Extensive
experiments are conducted on six public node classification datasets with
heterophilic graphs. The results show that GPNN significantly improves
classification performance over state-of-the-art methods. In addition, further
analyses reveal the advantage of the proposed GPNN in filtering out irrelevant
neighbors and reducing over-smoothing.
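The select-then-convolve pipeline described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the learned pointer-network scorer is replaced by a plain dot product with the central node, and the 1D convolution uses a fixed averaging kernel.

```python
def gpnn_select_and_conv(center, neighbors, k=4, kernel=(0.5, 0.5)):
    """Sketch of the GPNN pipeline: score multi-hop neighbors against the
    central node, keep the top-k as an ordered sequence, then run a 1D
    convolution over that sequence. The dot-product scorer and the fixed
    kernel are stand-ins for the learned components in the paper."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # Relevance scores against the central node (pointer-network stand-in).
    scores = [dot(n, center) for n in neighbors]
    # Order by relevance and keep the k most relevant neighbors.
    order = sorted(range(len(neighbors)), key=lambda i: -scores[i])[:k]
    seq = [neighbors[i] for i in order]

    # 1D convolution along the sequence axis, applied per feature dimension.
    d = len(center)
    out_len = len(seq) - len(kernel) + 1
    return [[sum(kernel[t] * seq[i + t][j] for t in range(len(kernel)))
             for j in range(d)]
            for i in range(out_len)]


center = [1.0, 0.0]
neighbors = [[0.0, 5.0], [3.0, 0.0], [1.0, 1.0], [2.0, 2.0], [0.5, 0.0]]
features = gpnn_select_and_conv(center, neighbors)
print(features)  # [[2.5, 1.0], [1.5, 1.5], [0.75, 0.5]]
```

Note how the neighbor [0.0, 5.0], which is orthogonal to the central node, scores lowest and is filtered out entirely; this is the "filtering out irrelevant neighbors" behavior the abstract refers to.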
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z) - Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
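The per-group weight idea from this blurb can be sketched in a few lines. The degree boundary and the weight matrices below are illustrative assumptions, not values from the paper:

```python
def stratified_transform(node_feats, degrees, group_weights, boundary=3):
    """Sketch of degree-based stratification: nodes are split into groups
    by degree (here just two groups around one illustrative boundary),
    and each group applies its own weight matrix. The grouping rule and
    the weights are assumptions, not from the paper."""
    out = []
    for x, deg in zip(node_feats, degrees):
        W = group_weights[0] if deg < boundary else group_weights[1]
        out.append([sum(w * xi for w, xi in zip(row, x)) for row in W])
    return out


# Two illustrative 2x2 weight matrices: identity for low-degree nodes,
# a doubling matrix for high-degree nodes.
weights = [
    [[1.0, 0.0], [0.0, 1.0]],
    [[2.0, 0.0], [0.0, 2.0]],
]
transformed = stratified_transform([[1.0, 2.0], [3.0, 4.0]], [1, 5], weights)
print(transformed)  # [[1.0, 2.0], [6.0, 8.0]]
```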
arXiv Detail & Related papers (2023-12-16T14:09:23Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with
Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z) - Hop-Aware Dimension Optimization for Graph Neural Networks [11.341455005324104]
We propose a simple yet effective ladder-style GNN architecture, namely LADDER-GNN.
Specifically, we separate messages from different hops and assign different dimensions to them before concatenating them to obtain the node representation.
Results show that the proposed simple hop-aware representation learning solution can achieve state-of-the-art performance on most datasets.
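The hop-wise dimension assignment described above can be sketched as below. As a stand-in for a learned projection, each hop's message is simply truncated or zero-padded to its assigned dimension; the dimensions themselves are illustrative, whereas the paper optimizes them:

```python
def ladder_node_repr(hop_messages, hop_dims):
    """Sketch of the hop-aware dimension idea: the aggregated message
    from each hop is mapped to its own dimension (here via truncation or
    zero-padding as a stand-in for a learned projection), and the
    per-hop parts are concatenated into the node representation."""
    parts = []
    for msg, d in zip(hop_messages, hop_dims):
        proj = (list(msg) + [0.0] * d)[:d]  # stand-in projection to d dims
        parts.extend(proj)
    return parts


# Hop 1 keeps 3 dims; hop 2 is compressed to 1 dim before concatenation.
rep = ladder_node_repr([[1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]], [3, 1])
print(rep)  # [1.0, 2.0, 3.0, 5.0]
```

Giving distant hops fewer dimensions is one way to keep their contribution bounded while still concatenating, which is the ladder shape the name suggests.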
arXiv Detail & Related papers (2021-05-30T10:12:56Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - NCGNN: Node-level Capsule Graph Neural Network [45.23653314235767]
Node-level Capsule Graph Neural Network (NCGNN) represents nodes as groups of capsules.
A novel dynamic routing procedure is developed to adaptively select appropriate capsules for aggregation.
NCGNN can well address the over-smoothing issue and outperforms the state of the art by producing better node embeddings for classification.
arXiv Detail & Related papers (2020-12-07T06:46:17Z) - Towards Deeper Graph Neural Networks with Differentiable Group
Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN)
arXiv Detail & Related papers (2020-06-12T07:18:02Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
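The "weighted sum augmented with pairwise interactions" idea can be illustrated with a simplified aggregator. This is an assumption-laden sketch, not the paper's operator: it adds the average elementwise product over all neighbor pairs to a plain weighted sum:

```python
def bilinear_aggregate(neighbors, weights=None):
    """Sketch of the bilinear idea: a standard weighted sum of neighbor
    vectors, plus the average elementwise product over all neighbor
    pairs. The exact BGNN operator differs; this only illustrates the
    pairwise-interaction term."""
    n = len(neighbors)
    d = len(neighbors[0])
    if weights is None:
        weights = [1.0 / n] * n  # uniform weights as a stand-in

    # Linear part: ordinary weighted sum, as in a plain GNN aggregator.
    linear = [sum(w * v[j] for w, v in zip(weights, neighbors))
              for j in range(d)]

    # Bilinear part: elementwise products over all unordered neighbor pairs.
    pair_sum = [0.0] * d
    pairs = 0
    for i in range(n):
        for k in range(i + 1, n):
            pairs += 1
            for j in range(d):
                pair_sum[j] += neighbors[i][j] * neighbors[k][j]
    bilinear = [s / pairs for s in pair_sum] if pairs else [0.0] * d

    return [a + b for a, b in zip(linear, bilinear)]


agg = bilinear_aggregate([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(agg)  # ~= [1.0, 1.0]
```

The pairwise term lets the aggregator capture co-occurrence between neighbor features that a purely linear sum cannot express.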
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.