Bridging Theory and Practice in Link Representation with Graph Neural Networks
- URL: http://arxiv.org/abs/2506.24018v1
- Date: Mon, 30 Jun 2025 16:22:15 GMT
- Title: Bridging Theory and Practice in Link Representation with Graph Neural Networks
- Authors: Veronica Lachi, Francesco Ferrini, Antonio Longa, Bruno Lepri, Andrea Passerini, Manfred Jaeger
- Abstract summary: Graph Neural Networks (GNNs) are widely used to compute representations of node pairs for downstream tasks such as link prediction. We introduce a unifying framework, the $k_\phi$-$k_\rho$-$m$ framework, that subsumes existing message-passing link models. We use a graph symmetry metric that quantifies the difficulty of distinguishing links and show that while expressive models may underperform on standard benchmarks, they significantly outperform simpler ones as symmetry increases.
- Score: 15.088089745469652
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) are widely used to compute representations of node pairs for downstream tasks such as link prediction. Yet, theoretical understanding of their expressive power has focused almost entirely on graph-level representations. In this work, we shift the focus to links and provide the first comprehensive study of GNN expressiveness in link representation. We introduce a unifying framework, the $k_\phi$-$k_\rho$-$m$ framework, that subsumes existing message-passing link models and enables formal expressiveness comparisons. Using this framework, we derive a hierarchy of state-of-the-art methods and offer theoretical tools to analyze future architectures. To complement our analysis, we propose a synthetic evaluation protocol comprising the first benchmark specifically designed to assess link-level expressiveness. Finally, we ask: does expressiveness matter in practice? We use a graph symmetry metric that quantifies the difficulty of distinguishing links and show that while expressive models may underperform on standard benchmarks, they significantly outperform simpler ones as symmetry increases, highlighting the need for dataset-aware model selection.
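The abstract does not spell the framework out, but a reading consistent with the notation is: $k_\phi$ rounds of node-level message passing ($\phi$), $k_\rho$ rounds of pair-level refinement ($\rho$), and a readout $m$ that scores the pair. A minimal PyTorch sketch under those assumptions (all layer choices here are illustrative, not the paper's):

```python
# Hypothetical sketch of a k_phi - k_rho - m style link model.
# Assumptions (not from the paper): phi = node-level message passing,
# rho = pair-level refinement, m = final readout on the pair embedding.
import torch
import torch.nn as nn

class LinkModel(nn.Module):
    def __init__(self, dim, k_phi=2, k_rho=1):
        super().__init__()
        self.phi = nn.ModuleList(nn.Linear(2 * dim, dim) for _ in range(k_phi))
        self.rho = nn.ModuleList(nn.Linear(2 * dim, dim) for _ in range(k_rho))
        self.m = nn.Linear(dim, 1)  # readout: pair embedding -> link score

    def forward(self, x, adj, pairs):
        # k_phi rounds of node-level message passing
        for layer in self.phi:
            msg = adj @ x                        # sum-aggregate neighbor features
            x = torch.relu(layer(torch.cat([x, msg], dim=-1)))
        # initial pair embedding from the two endpoints (symmetric combination)
        h = x[pairs[:, 0]] * x[pairs[:, 1]]
        # k_rho rounds of pair-level refinement conditioned on the endpoints
        for layer in self.rho:
            ctx = x[pairs[:, 0]] + x[pairs[:, 1]]
            h = torch.relu(layer(torch.cat([h, ctx], dim=-1)))
        return self.m(h).squeeze(-1)             # m: scalar link score

n, d = 6, 8
x = torch.randn(n, d)
adj = (torch.rand(n, n) < 0.3).float()
pairs = torch.tensor([[0, 1], [2, 5]])
print(LinkModel(d)(x, adj, pairs))
```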
Related papers
- Understanding the Design Principles of Link Prediction in Directed Settings [1.6727186769396276]
Link prediction is a widely studied task in Graph Representation Learning (GRL). In this paper, we focus on the challenge of directed link prediction by evaluating key heuristics that have been successful in undirected settings. We propose simple but effective adaptations of these heuristics to the directed link prediction task and demonstrate that these modifications produce competitive performance.
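The summary does not name the heuristics; one plausible adaptation of this kind takes the undirected common-neighbors score and restricts it to direction-respecting paths (a hypothetical example, not necessarily the paper's):

```python
# Illustrative adaptation of the common-neighbors heuristic to directed
# graphs (one plausible variant; the paper's exact adaptations may differ).
def common_neighbors_undirected(nbrs, u, v):
    return len(nbrs[u] & nbrs[v])

def common_neighbors_directed(out_nbrs, in_nbrs, u, v):
    # Score u -> v by nodes w with u -> w and w -> v (length-2 directed paths).
    return len(out_nbrs[u] & in_nbrs[v])

out_nbrs = {0: {1, 2}, 1: {2}, 2: {3}, 3: set()}
in_nbrs = {0: set(), 1: {0}, 2: {0, 1}, 3: {2}}
print(common_neighbors_directed(out_nbrs, in_nbrs, 0, 3))  # 1 (via node 2)
```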
arXiv Detail & Related papers (2025-02-20T20:01:35Z)
- Rethinking Link Prediction for Directed Graphs [73.36395969796804]
Link prediction for directed graphs is a crucial task with diverse real-world applications. Recent advances in embedding methods and Graph Neural Networks (GNNs) have shown promising improvements. We propose a unified framework to assess the expressiveness of existing methods, highlighting the impact of dual embeddings and decoder design on directed link prediction performance.
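A minimal sketch of the dual-embedding idea, assuming each node keeps separate source and target vectors and an asymmetric bilinear decoder (illustrative choices, not the paper's exact design):

```python
# Each node gets a source vector and a target vector, and an asymmetric
# decoder scores u -> v. Illustrative, not the paper's exact architecture.
import torch
import torch.nn as nn

class DualEmbeddingScorer(nn.Module):
    def __init__(self, n_nodes, dim):
        super().__init__()
        self.src = nn.Embedding(n_nodes, dim)         # role as edge source
        self.dst = nn.Embedding(n_nodes, dim)         # role as edge target
        self.W = nn.Parameter(torch.randn(dim, dim))  # asymmetric decoder

    def score(self, u, v):
        # score(u, v) != score(v, u) in general, which lets the model
        # represent directed relations a single shared embedding cannot.
        return (self.src(u) @ self.W * self.dst(v)).sum(-1)

m = DualEmbeddingScorer(10, 16)
u, v = torch.tensor([0]), torch.tensor([3])
print(m.score(u, v).item(), m.score(v, u).item())
```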
arXiv Detail & Related papers (2025-02-08T23:51:05Z)
- Boosting Graph Neural Network Expressivity with Learnable Lanczos Constraints [7.605749412696919]
Graph Neural Networks (GNNs) excel in handling graph-structured data but often underperform in link prediction tasks. We present a novel method to enhance the expressivity of GNNs by embedding induced subgraphs into the graph Laplacian matrix's eigenbasis. We demonstrate the ability to distinguish graphs that are indistinguishable by 2-WL, while maintaining efficient time complexity.
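The core spectral step can be illustrated by projecting an induced subgraph's indicator vector onto the Laplacian eigenbasis; the learnable Lanczos machinery of the paper is omitted here:

```python
# Sketch: project the indicator vector of an induced subgraph onto the
# eigenvectors of the graph Laplacian. Illustrative of the idea only.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian
eigvals, eigvecs = np.linalg.eigh(L)    # orthonormal eigenbasis

indicator = np.array([1.0, 1.0, 1.0, 0.0])  # induced subgraph {0, 1, 2}
coords = eigvecs.T @ indicator          # spectral coordinates of the subgraph
print(np.round(coords, 3))
```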
arXiv Detail & Related papers (2024-08-22T12:22:00Z)
- Link Prediction with Relational Hypergraphs [28.594243961681684]
Link prediction with knowledge graphs has been thoroughly studied in graph machine learning. We propose a framework for link prediction with relational hypergraphs, unlocking applications of graph neural networks to fully relational structures.
arXiv Detail & Related papers (2024-02-06T15:05:40Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
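A toy instance of the idea, assuming a quadratic energy regularized by the clique-expansion Laplacian (the paper's energy family is more general):

```python
# "Embeddings as energy minimizers": define a simple hypergraph-regularized
# energy E(Z) = ||Z - X||^2 + lam * tr(Z^T L Z), with L from the hyperedge
# clique expansion, and minimize it by gradient descent.
import numpy as np

hyperedges = [{0, 1, 2}, {2, 3}]
n, d, lam, lr = 4, 2, 1.0, 0.05
A = np.zeros((n, n))
for e in hyperedges:                     # clique expansion of each hyperedge
    for i in e:
        for j in e:
            if i != j:
                A[i, j] = 1.0
L = np.diag(A.sum(axis=1)) - A

X = np.random.randn(n, d)                # input features
Z = X.copy()                             # embeddings, initialized at X
for _ in range(200):
    grad = 2 * (Z - X) + 2 * lam * L @ Z
    Z -= lr * grad                       # inner minimization of E(Z)
print(Z)                                 # the minimizer serves as node embeddings
```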
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Graph Collaborative Reasoning [18.45161138837384]
Graph Collaborative Reasoning (GCR) uses neighbor link information for relational reasoning on graphs from a logical reasoning perspective.
We provide a simple approach to translate a graph structure into logical expressions, so that the link prediction task can be converted into a neural logic reasoning problem.
To show the effectiveness of our work, we conduct experiments on graph-related tasks such as link prediction and recommendation based on commonly used benchmark datasets.
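The translation step can be illustrated with hard Boolean logic; GCR itself replaces these operators with learned neural modules:

```python
# Toy illustration of translating graph structure into a logical
# expression for link prediction. This sketch only mirrors the
# translation step, not the paper's neural logic model.
edges = {("u", "a"), ("a", "b")}        # observed interaction links

def linked(x, y):
    return (x, y) in edges

# Horn-style rule: linked(u, w) AND linked(w, v) -> linked(u, v)
def predict(u, v):
    return any(linked(u, w) and linked(w, v) for w in {"a", "b", "u"})

print(predict("u", "b"))  # True: the expression entails the candidate link
```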
arXiv Detail & Related papers (2021-12-27T14:27:58Z)
- GraphiT: Encoding Graph Structure in Transformers [37.33808493548781]
We show that representing graphs as sets of node features, combined with structural and positional information, can outperform representations learned with classical graph neural networks (GNNs).
Our model, GraphiT, encodes such information by (i) leveraging relative positional encoding strategies in self-attention scores based on positive definite kernels on graphs, and (ii) enumerating and encoding local sub-structures such as paths of short length.
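A simplified version of mechanism (i), assuming the heat kernel $\exp(-\beta L)$ as the positive definite kernel and single-head attention:

```python
# Kernel-based relative positional encoding: modulate attention scores
# with a positive definite kernel on the graph (here the heat kernel).
# Simplified; the real model folds this into multi-head attention.
import numpy as np
from scipy.linalg import expm

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A
K = expm(-1.0 * L)                       # heat/diffusion kernel on the graph

X = np.random.randn(3, 4)                # node features as tokens
scores = np.exp(X @ X.T / np.sqrt(X.shape[1]))  # vanilla attention weights
attn = K * scores                        # kernel-weighted attention
attn /= attn.sum(1, keepdims=True)       # row-normalize
print(attn.round(3))
```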
arXiv Detail & Related papers (2021-06-10T11:36:22Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB), the Graph Information Bottleneck (GIB) aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
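In IB notation, the objective has the usual two-term shape (symbols assumed here, with $\mathcal{D}=(A,X)$ the input graph data):

```latex
% GIB-style objective: retain what predicts Y, compress the rest of D.
\max_{p(Z \mid \mathcal{D})} \; I(Z; Y) - \beta \, I(Z; \mathcal{D}),
\qquad \mathcal{D} = (A, X)
```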
arXiv Detail & Related papers (2020-10-24T07:13:00Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structural features for graph representation learning.
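The basic encoding can be sketched with plain BFS: every node is annotated with its distances to a target node set $S$ before message passing, which breaks the symmetry that defeats vanilla GNNs:

```python
# Distance Encoding sketch: annotate every node with its shortest-path
# distance to each node in the target set S, so otherwise-symmetric
# nodes become distinguishable as extra GNN input features.
from collections import deque

def bfs_distances(adj, source):
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
S = [0, 3]  # target node set, e.g. a candidate link
features = {v: [bfs_distances(adj, s).get(v, -1) for s in S] for v in adj}
print(features)  # per-node distance vector to S
```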
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Optimal Transport Graph Neural Networks [31.191844909335963]
Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation.
We introduce OT-GNN, a model that computes graph embeddings using parametric prototypes.
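A sketch of the readout, assuming uniform node/prototype weights and an entropic (Sinkhorn) approximation of the transport cost:

```python
# Prototype-based OT readout: represent a graph by its optimal-transport
# cost to a few prototype point clouds instead of a naive sum of node
# embeddings. Tiny Sinkhorn loop; illustrative of the OT-GNN idea.
import numpy as np

def sinkhorn_cost(X, P, reg=1.0, iters=100):
    C = ((X[:, None, :] - P[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    K = np.exp(-C / reg)
    a, b = np.full(len(X), 1 / len(X)), np.full(len(P), 1 / len(P))
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(iters):                  # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    T = u[:, None] * K * v[None, :]         # transport plan
    return (T * C).sum()

node_emb = np.random.randn(5, 3)            # node embeddings from a GNN
prototypes = [np.random.randn(4, 3) for _ in range(2)]
graph_repr = [sinkhorn_cost(node_emb, P) for P in prototypes]
print(graph_repr)                           # OT costs as the graph embedding
```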
arXiv Detail & Related papers (2020-06-08T14:57:39Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCN).
We design a specific GAE-based model for graph clustering that is consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
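A minimal sketch of the one-encoder, dual-decoder shape (layer sizes and the clustering head are illustrative assumptions):

```python
# One GCN-style encoder; decoder 1 reconstructs the adjacency (standard
# GAE), decoder 2 produces soft cluster assignments. Illustrative
# dimensions and heads, not the paper's exact formulation.
import torch
import torch.nn as nn

class EGAESketch(nn.Module):
    def __init__(self, in_dim, hid_dim, n_clusters):
        super().__init__()
        self.enc = nn.Linear(in_dim, hid_dim)              # encoder layer
        self.cluster_dec = nn.Linear(hid_dim, n_clusters)  # clustering head

    def forward(self, x, adj_norm):
        z = torch.relu(self.enc(adj_norm @ x))          # Z = ReLU(A_hat X W)
        a_hat = torch.sigmoid(z @ z.T)                  # decoder 1: adjacency
        q = torch.softmax(self.cluster_dec(z), dim=-1)  # decoder 2: clusters
        return z, a_hat, q

n, d = 5, 8
x = torch.randn(n, d)
adj_norm = torch.eye(n)                  # placeholder normalized adjacency
z, a_hat, q = EGAESketch(d, 4, 2)(x, adj_norm)
print(a_hat.shape, q.shape)
```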
arXiv Detail & Related papers (2020-02-20T09:53:28Z)