Position-Sensing Graph Neural Networks: Proactively Learning Nodes Relative Positions
- URL: http://arxiv.org/abs/2105.11346v1
- Date: Mon, 24 May 2021 15:30:30 GMT
- Title: Position-Sensing Graph Neural Networks: Proactively Learning Nodes Relative Positions
- Authors: Zhenyue Qin and Saeed Anwar and Dongwoo Kim and Yang Liu and Pan Ji and Tom Gedeon
- Abstract summary: Most existing graph neural networks (GNNs) learn node embeddings using the framework of message passing and aggregation.
We propose Position-Sensing Graph Neural Networks (PSGNNs), learning how to choose anchors in a back-propagatable fashion.
PSGNNs on average boost AUC more than 14% for pairwise node classification and 18% for link prediction.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most existing graph neural networks (GNNs) learn node embeddings using the
framework of message passing and aggregation. Such GNNs are incapable of
learning relative positions between graph nodes within a graph. To empower GNNs
with the awareness of node positions, some nodes are set as anchors. Then,
using the distances from a node to the anchors, GNNs can infer relative
positions between nodes. However, P-GNNs select anchors arbitrarily, which
compromises both position-awareness and feature extraction. To eliminate this
compromise, we demonstrate that selecting evenly distributed and asymmetric
anchors is essential. Moreover, we show that choosing anchors that can
aggregate the embeddings of all nodes within a graph is NP-hard, so devising
an efficient deterministic algorithm for optimal anchor selection is
practically infeasible. To ensure position-awareness and bypass
NP-completeness, we propose Position-Sensing Graph Neural Networks (PSGNNs),
learning how to choose anchors in a back-propagatable fashion. Experiments
verify the effectiveness of PSGNNs against state-of-the-art GNNs, substantially
improving performance on various synthetic and real-world graph datasets while
enjoying stable scalability. Specifically, PSGNNs on average boost AUC more
than 14% for pairwise node classification and 18% for link prediction over the
existing state-of-the-art position-aware methods. Our source code is publicly
available at: https://github.com/ZhenyueQin/PSGNN
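The core mechanism, distances to a learnable set of anchors used as positional features, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name and the softmax-based soft anchor selection are illustrative assumptions, shown only to make the "back-propagatable anchor choice" idea concrete:

```python
import numpy as np

def soft_anchor_features(dist, anchor_logits):
    """Positional features from soft, learnable anchors (sketch).

    dist:          (n, n) pairwise shortest-path distances
    anchor_logits: (k, n) per-anchor scores over nodes; in a real model
                   these are trainable parameters, and because the softmax
                   below is differentiable, anchors can be chosen by
                   back-propagation rather than picked arbitrarily.
    Returns (n, k): expected distance of each node to each soft anchor.
    """
    # Numerically stable softmax over nodes, one row per anchor slot.
    z = anchor_logits - anchor_logits.max(axis=1, keepdims=True)
    w = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)  # (k, n)
    # Expected distance from every node to each soft anchor.
    return dist @ w.T                                     # (n, k)
```

When the logits for one anchor slot concentrate on a single node, each node's feature approaches its exact distance to that node; a gradient-based learner can move this distribution during training instead of fixing anchors in advance.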
Related papers
- Bring Your Own View: Graph Neural Networks for Link Prediction with Personalized Subgraph Selection [57.34881616131377]
We introduce a Personalized Subgraph Selector (PS2) as a plug-and-play framework to automatically, personally, and inductively identify optimal subgraphs for different edges.
PS2 is instantiated as a bi-level optimization problem that can be solved efficiently in a differentiable manner.
We suggest a brand-new angle towards GNNLP training: by first identifying the optimal subgraphs for edges; and then focusing on training the inference model by using the sampled subgraphs.
arXiv Detail & Related papers (2022-12-23T17:30:19Z)
- Every Node Counts: Improving the Training of Graph Neural Networks on Node Classification [9.539495585692007]
We propose novel objective terms for the training of GNNs for node classification.
Our first term seeks to maximize the mutual information between node and label features.
Our second term promotes anisotropic smoothness in the prediction maps.
arXiv Detail & Related papers (2022-11-29T23:25:14Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Position-based Hash Embeddings For Scaling Graph Neural Networks [8.87527266373087]
Graph Neural Networks (GNNs) compute node representations by taking into account the topology of the node's ego-network and the features of the ego-network's nodes.
When the nodes do not have high-quality features, GNNs learn an embedding layer to compute node embeddings and use them as input features.
To reduce the memory associated with this embedding layer, hashing-based approaches, commonly used in applications like NLP and recommender systems, can potentially be used.
We present approaches that take advantage of the nodes' position in the graph to dramatically reduce the memory required.
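The memory saving can be sketched with the generic hashing trick for embeddings. This is an illustrative example, not the paper's position-based scheme, and `hashed_embedding` is a hypothetical helper:

```python
import numpy as np

def hashed_embedding(node_id, table, num_hashes=2, seed=17):
    """Memory-reduced node embedding via the hashing trick (generic
    sketch, not the paper's exact position-based method): each node
    averages a few rows of a small shared table instead of owning its
    own row, so memory is O(buckets) rather than O(nodes)."""
    rows, dim = table.shape
    vec = np.zeros(dim)
    for h in range(num_hashes):
        # Hash of (seed, hash index, node id) picks a shared bucket.
        bucket = hash((seed, h, node_id)) % rows
        vec += table[bucket]
    return vec / num_hashes
```

Distinct nodes may collide on a bucket; averaging several hashed rows softens collisions, and the position-based approaches of the paper go further by letting a node's location in the graph inform how the shared parameters are assigned.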
arXiv Detail & Related papers (2021-08-31T22:42:25Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are
inefficient because they cannot exploit the sparsity of the underlying graph
structure.
We propose Distance Encoding (DE) as a new class of graph representation learning techniques.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- GraphReach: Position-Aware Graph Neural Network using Reachability Estimations [12.640837452980332]
GraphReach is a position-aware inductive GNN that captures the global positions of nodes through reachability estimations.
We show that this anchor selection problem is NP-hard and, consequently, develop a greedy (1-1/e) approximation.
Empirical evaluation against state-of-the-art GNN architectures reveals that GraphReach provides up to 40% relative improvement in accuracy.
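The greedy (1-1/e) approximation follows the textbook max-coverage pattern. The sketch below is that generic pattern, not GraphReach's actual code; `reach_sets` stands in for the paper's reachability estimates:

```python
def greedy_anchor_cover(reach_sets, k):
    """Greedy (1 - 1/e)-approximation for coverage-style anchor
    selection (a textbook max-coverage sketch in the spirit of the
    paper's analysis, not its exact algorithm).

    reach_sets: {node: set of nodes it reaches}
    k:          number of anchors to pick
    """
    covered, anchors = set(), []
    for _ in range(k):
        # Pick the node whose reach set adds the most new coverage.
        best = max(reach_sets, key=lambda v: len(reach_sets[v] - covered))
        anchors.append(best)
        covered |= reach_sets[best]
    return anchors, covered
```

Each iteration takes the locally best anchor, and the standard submodularity argument gives the (1 - 1/e) guarantee relative to the optimal k-anchor coverage.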
arXiv Detail & Related papers (2020-08-19T14:30:03Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Label-Consistency based Graph Neural Networks for Semi-supervised Node Classification [47.753422069515366]
Graph neural networks (GNNs) achieve remarkable success in graph-based semi-supervised node classification.
In this paper, we propose the label-consistency based graph neural network (LC-GNN), leveraging node pairs that are unconnected but share the same labels to enlarge the receptive field of nodes in GNNs.
Experiments on benchmark datasets demonstrate the proposed LC-GNN outperforms traditional GNNs in graph-based semi-supervised node classification.
arXiv Detail & Related papers (2020-07-27T11:17:46Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
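The pairwise-interaction idea can be sketched with a standard identity: the sum of elementwise products over all neighbor pairs is computable in linear time from the sum and the sum of squares. This is an illustrative reduction, not BGNN's exact operator:

```python
import numpy as np

def bilinear_neighbor_term(H):
    """Sum of pairwise elementwise products over a node's neighbors,
    computed in O(n * d) via the identity
        sum_{i<j} h_i * h_j = ((sum_i h_i)^2 - sum_i h_i^2) / 2.
    A sketch of the kind of pairwise-interaction term that augments
    the usual weighted-sum aggregation (not the paper's exact operator).

    H: (n, d) array of the n neighbors' embeddings."""
    s = H.sum(axis=0)                      # (d,) sum of embeddings
    return 0.5 * (s * s - (H * H).sum(axis=0))
```

The identity avoids the quadratic loop over neighbor pairs, which is what makes adding such an interaction term to a message-passing layer affordable on large neighborhoods.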
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.