GraphReach: Position-Aware Graph Neural Network using Reachability
Estimations
- URL: http://arxiv.org/abs/2008.09657v4
- Date: Fri, 20 Aug 2021 11:04:30 GMT
- Title: GraphReach: Position-Aware Graph Neural Network using Reachability
Estimations
- Authors: Sunil Nishad and Shubhangi Agarwal and Arnab Bhattacharya and Sayan
Ranu
- Abstract summary: GraphReach is a position-aware inductive GNN that captures the global positions of nodes through reachability estimations.
We show that this anchor selection problem is NP-hard and, consequently, develop a greedy (1-1/e) approximation.
Empirical evaluation against state-of-the-art GNN architectures reveals that GraphReach provides up to 40% relative improvement in accuracy.
- Score: 12.640837452980332
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The majority of existing graph neural networks (GNNs) learn node embeddings
that encode their local neighborhoods but not their positions. Consequently,
two nodes that are vastly distant but located in similar local neighborhoods
map to similar embeddings in those networks. This limitation prevents accurate
performance in predictive tasks that rely on position information. In this
paper, we develop GraphReach, a position-aware inductive GNN that captures the
global positions of nodes through reachability estimations with respect to a
set of anchor nodes. The anchors are strategically selected so that
reachability estimations across all the nodes are maximized. We show that this
combinatorial anchor selection problem is NP-hard and, consequently, develop a
greedy (1-1/e) approximation heuristic. Empirical evaluation against
state-of-the-art GNN architectures reveals that GraphReach provides up to 40%
relative improvement in accuracy. In addition, it is more robust to adversarial
attacks.
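To make the anchor-selection step concrete, below is a minimal sketch of the greedy (1-1/e) coverage heuristic described in the abstract, assuming a simple random-walk reachability estimator. The function names, walk parameters, and exact coverage objective are illustrative assumptions, not the paper's implementation.

```python
import random
import networkx as nx

def estimate_reachability(G, num_walks=50, walk_length=10, seed=0):
    """Crudely estimate which nodes each node can reach via short random
    walks (a stand-in for the paper's reachability estimations)."""
    rng = random.Random(seed)
    reach = {v: set() for v in G.nodes}
    for v in G.nodes:
        for _ in range(num_walks):
            u = v
            for _ in range(walk_length):
                nbrs = list(G.neighbors(u))
                if not nbrs:
                    break
                u = rng.choice(nbrs)
                reach[v].add(u)
    return reach

def greedy_anchors(G, k, **walk_kwargs):
    """Greedily pick k anchors so that as many nodes as possible have at
    least one anchor inside their estimated reachable set."""
    reach = estimate_reachability(G, **walk_kwargs)
    covered, anchors = set(), []
    candidates = set(G.nodes)
    for _ in range(k):
        # Marginal gain of a candidate = newly covered nodes it would add.
        best = max(
            candidates,
            key=lambda a: sum(1 for v in G.nodes
                              if v not in covered and a in reach[v]),
        )
        anchors.append(best)
        candidates.discard(best)
        covered |= {v for v in G.nodes if best in reach[v]}
    return anchors

# Example: pick 4 anchors on a small random graph.
G = nx.erdos_renyi_graph(50, 0.1, seed=1)
print(greedy_anchors(G, k=4))
```

Because this coverage objective is monotone and submodular, the greedy rule retains at least a (1 - 1/e) fraction of the optimal coverage, which is the approximation bound the abstract cites.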
Related papers
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each degree-based group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown strong performance on attributed network embedding.
We propose COllaborative graph Neural Networks--CONN, a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Position-based Hash Embeddings For Scaling Graph Neural Networks [8.87527266373087]
Graph Neural Networks (GNNs) compute node representations by taking into account the topology of the node's ego-network and the features of the ego-network's nodes.
When the nodes do not have high-quality features, GNNs learn an embedding layer to compute node embeddings and use them as input features.
To reduce the memory associated with this embedding layer, hashing-based approaches, commonly used in applications like NLP and recommender systems, can potentially be used.
We present approaches that take advantage of the nodes' position in the graph to dramatically reduce the memory required.
arXiv Detail & Related papers (2021-08-31T22:42:25Z)
- Position-Sensing Graph Neural Networks: Proactively Learning Nodes Relative Positions [26.926733376090052]
Most existing graph neural networks (GNNs) learn node embeddings using the framework of message passing and aggregation.
We propose Position-Sensing Graph Neural Networks (PSGNNs), which learn how to choose anchors in a back-propagatable fashion.
PSGNNs on average boost AUC more than 14% for pairwise node classification and 18% for link prediction.
arXiv Detail & Related papers (2021-05-24T15:30:30Z)
- Graph Attention Networks with Positional Embeddings [7.552100672006174]
Graph Neural Networks (GNNs) are deep learning methods that provide state-of-the-art performance in node classification tasks.
We propose a framework, termed Graph Attentional Networks with Positional Embeddings (GAT-POS), to enhance GATs with positional embeddings.
We show that GAT-POS reaches remarkable improvement compared to strong GNN baselines and recent structural embedding enhanced GNNs on non-homophilic graphs.
arXiv Detail & Related papers (2021-05-09T22:13:46Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman (WL) tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of node features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes; a minimal sketch of such an aggregator appears after this list.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
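As noted in the Bilinear Graph Neural Network entry above, below is a minimal sketch of a bilinear neighbor aggregator in that spirit, assuming plain NumPy, mean-style aggregation, and a tanh nonlinearity; the weight shapes and normalization are illustrative assumptions rather than BGNN's exact operator.

```python
import numpy as np

def bilinear_aggregate(H, neighbors, W, V):
    """One layer of bilinear neighbor aggregation: the usual mean of
    linearly transformed neighbors is augmented with the mean Hadamard
    product over all neighbor pairs, capturing neighbor-neighbor
    interactions.  H: (n, d) node features; neighbors: list of neighbor
    index lists; W, V: (d, d') weights.  All names are illustrative."""
    n = H.shape[0]
    d_out = W.shape[1]
    out = np.zeros((n, d_out))
    for v in range(n):
        nbrs = neighbors[v]
        if not nbrs:
            continue
        # Standard GNN term: mean of linearly transformed neighbors.
        linear = (H[nbrs] @ W).mean(axis=0)
        # Bilinear term: mean over unordered neighbor pairs (i, j) of
        # (V h_i) * (V h_j), computed in O(k d) via the identity
        # sum_{i<j} t_i t_j = ((sum_i t_i)^2 - sum_i t_i^2) / 2.
        T = H[nbrs] @ V
        k = len(nbrs)
        if k > 1:
            s = T.sum(axis=0)
            pair_mean = (s * s - (T * T).sum(axis=0)) / (k * (k - 1))
        else:
            pair_mean = np.zeros(d_out)
        out[v] = np.tanh(linear + pair_mean)
    return out

# Example: 4 nodes, 3-dim features, 2-dim output.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
W, V = rng.normal(size=(3, 2)), rng.normal(size=(3, 2))
print(bilinear_aggregate(H, [[1, 2], [0, 2, 3], [0, 1], [1]], W, V))
```

The pairwise-sum identity keeps the bilinear term linear in the number of neighbors rather than quadratic, which is what makes such an aggregator practical on dense neighborhoods.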