Adversarial Permutation Guided Node Representations for Link Prediction
- URL: http://arxiv.org/abs/2012.08974v2
- Date: Sun, 28 Mar 2021 04:13:23 GMT
- Title: Adversarial Permutation Guided Node Representations for Link Prediction
- Authors: Indradyumna Roy, Abir De, Soumen Chakrabarti
- Abstract summary: A link prediction (LP) algorithm identifies node pairs between which new edges are likely to materialize in the future.
Most LP algorithms estimate a score for currently non-neighboring node pairs, and rank them by this score.
We propose PermGNN, which aggregates neighbor features using a recurrent, order-sensitive aggregator and directly minimizes an LP loss while it is `attacked' by an adversarial generator of neighbor permutations.
- Score: 27.31800918961859
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: After observing a snapshot of a social network, a link prediction (LP)
algorithm identifies node pairs between which new edges will likely materialize
in future. Most LP algorithms estimate a score for currently non-neighboring
node pairs, and rank them by this score. Recent LP systems compute this score
by comparing dense, low dimensional vector representations of nodes. Graph
neural networks (GNNs), in particular graph convolutional networks (GCNs), are
popular examples. For two nodes to be meaningfully compared, their embeddings
should be indifferent to reordering of their neighbors. GNNs typically use
simple, symmetric set aggregators to ensure this property, but this design
decision has been shown to produce representations with limited expressive
power. Sequence encoders are more expressive, but are permutation sensitive by
design. Recent efforts to overcome this dilemma turn out to be unsatisfactory
for LP tasks. In response, we propose PermGNN, which aggregates neighbor
features using a recurrent, order-sensitive aggregator and directly minimizes
an LP loss while it is `attacked' by an adversarial generator of neighbor
permutations. By design, PermGNN has more expressive power than
earlier symmetric aggregators. Next, we devise an optimization framework to map
PermGNN's node embeddings to a suitable locality-sensitive hash, which speeds
up reporting the top-$K$ most likely edges for the LP task. Our experiments on
diverse datasets show that PermGNN outperforms several state-of-the-art link
predictors by a significant margin, and can predict the most likely edges fast.
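The abstract's contrast between symmetric set aggregators and order-sensitive sequence aggregators, and the idea of an adversarial search over neighbor permutations, can be illustrated with a toy sketch. This is not the authors' implementation: the tanh-recurrence, the random weights, the squared-distance loss, and the brute-force permutation search are all illustrative stand-ins.

```python
# Toy sketch: a symmetric mean aggregator is permutation-invariant, a
# recurrent aggregator is not, and an "adversarial" permutation is the
# neighbor ordering that maximizes a loss on the resulting embedding.
import itertools
import numpy as np

rng = np.random.default_rng(0)
D = 4
W_h = rng.normal(scale=0.5, size=(D, D))  # toy recurrent weights
W_x = rng.normal(scale=0.5, size=(D, D))  # toy input weights

def mean_aggregate(neighbors):
    """Symmetric set aggregator: invariant to neighbor order."""
    return np.mean(neighbors, axis=0)

def recurrent_aggregate(neighbors):
    """Order-sensitive aggregator: a toy tanh-RNN over the neighbor sequence."""
    h = np.zeros(D)
    for x in neighbors:
        h = np.tanh(W_h @ h + W_x @ x)
    return h

def worst_case_permutation(neighbors, target):
    """Brute-force 'attack': find the neighbor ordering that maximizes an
    LP-style loss (here, squared distance from a target embedding)."""
    worst, worst_loss = None, -np.inf
    for perm in itertools.permutations(range(len(neighbors))):
        h = recurrent_aggregate([neighbors[i] for i in perm])
        loss = float(np.sum((h - target) ** 2))
        if loss > worst_loss:
            worst, worst_loss = perm, loss
    return worst, worst_loss

neighbors = [rng.normal(size=D) for _ in range(4)]
shuffled = neighbors[::-1]

# Mean is identical under reordering; the recurrent aggregate generally is not.
print(np.allclose(mean_aggregate(neighbors), mean_aggregate(shuffled)))  # True
print(np.allclose(recurrent_aggregate(neighbors), recurrent_aggregate(shuffled)))
perm, loss = worst_case_permutation(neighbors, target=np.zeros(D))
print(perm, round(loss, 3))
```

PermGNN trains against such worst-case orderings instead of enumerating them, which is what makes the sequence encoder usable for LP despite its permutation sensitivity.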
Related papers
- Efficient Link Prediction via GNN Layers Induced by Negative Sampling [92.05291395292537]
Graph neural networks (GNNs) for link prediction can loosely be divided into two broad categories.
First, node-wise architectures pre-compute individual embeddings for each node, which are later combined by a simple decoder to make predictions.
Second, edge-wise methods rely on the formation of edge-specific subgraph embeddings to enrich the representation of pair-wise relationships.
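The node-wise pattern described above can be sketched in a few lines. The embedding dictionary, node names, and sigmoid-of-inner-product decoder here are illustrative choices, not the paper's specific method.

```python
# Hedged sketch of a "node-wise" link predictor: embeddings are precomputed
# per node, and a simple decoder (an inner product squashed by a sigmoid)
# scores any candidate pair.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score_pair(emb, u, v):
    """Inner-product decoder: a higher score means a more likely future edge."""
    return sigmoid(float(emb[u] @ emb[v]))

rng = np.random.default_rng(1)
emb = {n: rng.normal(size=8) for n in ["a", "b", "c"]}
print(score_pair(emb, "a", "b"))  # a probability-like score in (0, 1)
```

Because the decoder only sees the two precomputed vectors, scoring a new pair is cheap; edge-wise methods trade this efficiency for richer pair-specific representations.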
arXiv Detail & Related papers (2023-10-14T07:02:54Z)
- Refined Edge Usage of Graph Neural Networks for Edge Prediction [51.06557652109059]
We propose a novel edge prediction paradigm named Edge-aware Message PassIng neuRal nEtworks (EMPIRE).
We first introduce an edge-splitting technique so that each edge is used solely as either topology or supervision.
To emphasize the difference between pairs connected by supervision edges and unconnected pairs, we further weight the messages, highlighting those that reflect this difference.
arXiv Detail & Related papers (2022-12-25T23:19:56Z)
- Graph Neural Networks for Link Prediction with Subgraph Sketching [15.808095529382138]
We propose a novel full-graph GNN called ELPH (Efficient Link Prediction with Hashing).
It passes subgraph sketches as messages to approximate the key components of subgraph GNNs (SGNNs) without explicit subgraph construction.
It outperforms existing SGNN models on many standard LP benchmarks while being orders of magnitude faster.
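One way sketches can summarize a neighborhood without materializing subgraphs is MinHash, which estimates the Jaccard overlap of two neighbor sets from short signatures. This is an illustrative sketch in that spirit, not ELPH's actual implementation; the hash family and parameters are assumptions.

```python
# Illustrative MinHash sketch: estimate neighborhood overlap (Jaccard
# similarity) of two nodes from fixed-size signatures, without comparing
# the full neighbor sets pairwise.
import random

def minhash_signature(items, num_hashes=64, seed=0):
    rnd = random.Random(seed)
    p = (1 << 61) - 1  # a large prime modulus
    # One (a, b) pair per hash function: h(x) = (a * hash(x) + b) mod p
    params = [(rnd.randrange(1, p), rnd.randrange(p)) for _ in range(num_hashes)]
    return [min((a * hash(x) + b) % p for x in items) for a, b in params]

def estimated_jaccard(sig1, sig2):
    """Fraction of positions where the signatures agree."""
    return sum(m1 == m2 for m1, m2 in zip(sig1, sig2)) / len(sig1)

n_u = {1, 2, 3, 4, 5, 6, 7, 8}
n_v = {5, 6, 7, 8, 9, 10, 11, 12}   # true Jaccard is 4/12 ≈ 0.33
s_u, s_v = minhash_signature(n_u), minhash_signature(n_v)
est = estimated_jaccard(s_u, s_v)
print(round(est, 2))
```

Passing such constant-size sketches as messages is what lets a full-graph GNN approximate subgraph-based features at much lower cost.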
arXiv Detail & Related papers (2022-09-30T14:20:07Z)
- Simplifying Node Classification on Heterophilous Graphs with Compatible Label Propagation [6.071760028190454]
We show that a well-known graph algorithm, Label Propagation, combined with a shallow neural network can achieve performance comparable to GNNs in semi-supervised node classification on graphs with high homophily.
In this paper, we show that this approach falls short on graphs with low homophily, where nodes often connect to nodes of opposite classes.
Our algorithm first learns a class compatibility matrix and then aggregates label predictions using Label Propagation weighted by class compatibilities.
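The compatibility-weighted propagation step can be sketched on a toy graph. Here the compatibility matrix H is hand-set for illustration (the paper learns it), and the graph, labels, and iteration count are invented for the example.

```python
# Toy sketch of compatibility-weighted label propagation: instead of assuming
# neighbors share labels (homophily), each step multiplies the propagated
# label distributions by a class-compatibility matrix H.
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # small heterophilous graph
H = np.array([[0.1, 0.9],
              [0.9, 0.1]])               # classes mostly connect *across*
Y = np.array([[1.0, 0.0],                # node 0 labeled class 0
              [0.5, 0.5],                # nodes 1, 2 unlabeled (uniform)
              [0.5, 0.5]])

P = A / A.sum(axis=1, keepdims=True)     # row-normalized adjacency
for _ in range(10):
    Y_new = P @ Y @ H                    # propagate, then re-map via H
    Y_new[0] = [1.0, 0.0]                # clamp the labeled node
    Y = Y_new / Y_new.sum(axis=1, keepdims=True)

print(Y.round(2))  # neighbors of node 0 lean toward the *other* class
```

With H close to the identity this reduces to ordinary label propagation; the off-diagonal weights are what let it handle low-homophily graphs.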
arXiv Detail & Related papers (2022-05-19T08:34:34Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization [70.8567058758375]
VQ-GNN is a universal framework to scale up any convolution-based GNN using Vector Quantization (VQ) without compromising performance.
Our framework avoids the "neighbor explosion" problem of GNNs using quantized representations combined with a low-rank version of the graph convolution matrix.
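The core quantization step can be illustrated with a minimal example. This is not VQ-GNN itself: the random codebook, feature sizes, and nearest-neighbor assignment are illustrative stand-ins (in practice the codebook is learned).

```python
# Minimal vector-quantization sketch: each node's feature vector is mapped
# to its nearest codebook entry, so aggregation over a huge neighborhood
# reduces to a weighted sum over a small, fixed set of codewords.
import numpy as np

rng = np.random.default_rng(2)
features = rng.normal(size=(100, 16))   # 100 nodes, 16-dim features
codebook = rng.normal(size=(8, 16))     # 8 codewords (random here; learned in practice)

def quantize(x, codebook):
    """Return the index of, and the nearest codeword to, x."""
    dists = np.linalg.norm(codebook - x, axis=1)
    idx = int(np.argmin(dists))
    return idx, codebook[idx]

codes = np.array([quantize(x, codebook)[0] for x in features])
# Every node now references one of only 8 codewords; a layer never has to
# touch more than 8 distinct messages, which sidesteps neighbor explosion.
counts = np.bincount(codes, minlength=len(codebook))
print(counts.sum())  # 100: every node mapped to some codeword
```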
arXiv Detail & Related papers (2021-10-27T11:48:50Z)
- Graph Pointer Neural Networks [11.656981519694218]
We present Graph Pointer Neural Networks (GPNN) to tackle the challenges mentioned above.
We leverage a pointer network to select the most relevant nodes from large multi-hop neighborhoods.
The GPNN significantly improves the classification performance of state-of-the-art methods.
arXiv Detail & Related papers (2021-10-03T10:18:25Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
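The pairwise-interaction idea above can be sketched with a simple aggregator. This is an illustrative toy, not BGNN's operator: it omits the learned weights and combines the plain sum with elementwise products over all neighbor pairs.

```python
# Toy sketch of a bilinear aggregator: besides the usual sum of neighbor
# representations, include the sum of elementwise products over all pairs
# of neighbors, which a plain (weighted) sum cannot capture.
import numpy as np

def bilinear_aggregate(neighbors):
    X = np.asarray(neighbors, dtype=float)
    s = X.sum(axis=0)                         # linear part: plain sum
    # sum over i<j of x_i * x_j, via the identity ((Σx)² − Σx²) / 2
    pairwise = 0.5 * (s * s - (X * X).sum(axis=0))
    return s + pairwise

print(bilinear_aggregate([[1.0, 2.0], [3.0, 4.0]]))  # [7. 14.]
```

For the two neighbors above, the linear part is [4, 6] and the pairwise part is [3, 8], so the aggregate is [7, 14]; the identity avoids explicitly looping over all neighbor pairs.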
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.