Sheaf4Rec: Sheaf Neural Networks for Graph-based Recommender Systems
- URL: http://arxiv.org/abs/2304.09097v3
- Date: Sat, 16 Mar 2024 06:59:28 GMT
- Title: Sheaf4Rec: Sheaf Neural Networks for Graph-based Recommender Systems
- Authors: Antonio Purificato, Giulia Cassarà, Federico Siciliano, Pietro Liò, Fabrizio Silvestri
- Abstract summary: We propose a cutting-edge model inspired by category theory: Sheaf4Rec.
Unlike single vector representations, Sheaf Neural Networks and their corresponding Laplacians represent each node (and edge) using a vector space.
Our proposed model exhibits a noteworthy relative improvement of up to 8.53% on F1-Score@10 and an impressive increase of up to 11.29% on NDCG@10.
- Score: 18.596875449579688
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advancements in Graph Neural Networks (GNN) have facilitated their widespread adoption in various applications, including recommendation systems. GNNs have proven to be effective in addressing the challenges posed by recommendation systems by efficiently modeling graphs in which nodes represent users or items and edges denote preference relationships. However, current GNN techniques represent nodes by means of a single static vector, which may inadequately capture the intricate complexities of users and items. To overcome these limitations, we propose a solution integrating a cutting-edge model inspired by category theory: Sheaf4Rec. Unlike single vector representations, Sheaf Neural Networks and their corresponding Laplacians represent each node (and edge) using a vector space. Our approach takes advantage of this theory, resulting in a more comprehensive representation that can be effectively exploited during inference, providing a versatile method applicable to a wide range of graph-related tasks and demonstrating unparalleled performance. Our proposed model exhibits a noteworthy relative improvement of up to 8.53% on F1-Score@10 and an impressive increase of up to 11.29% on NDCG@10, outperforming existing state-of-the-art models such as Neural Graph Collaborative Filtering (NGCF), KGTORe and other recently developed GNN-based models. In addition to its superior predictive capabilities, Sheaf4Rec shows remarkable improvements in terms of efficiency: we observe substantial runtime improvements ranging from 2.5% up to 37% when compared to other GNN-based competitor models, indicating a more efficient way of handling information while achieving better performance. Code is available at https://github.com/antoniopurificato/Sheaf4Rec.
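The abstract describes the core idea but this summary includes no code. As a rough illustration of what "each node (and edge) gets a vector space" can mean in practice, the PyTorch sketch below implements one sheaf-diffusion-style update over a user-item interaction graph: every edge endpoint receives a learned restriction map into a shared edge stalk, and a single diffusion step nudges user and item representations toward agreement along observed interactions. The class name `SheafDiffusionLayer`, the choice to predict restriction maps from node features, and the single explicit diffusion step are illustrative assumptions, not the authors' Sheaf4Rec implementation (see the linked repository for that).

```python
# A minimal, illustrative sheaf-diffusion layer for a user-item graph.
# This sketches the general idea behind sheaf neural networks and is
# NOT the authors' Sheaf4Rec implementation; predicting restriction
# maps from node features and using a single diffusion step per layer
# are assumptions made here for clarity.
import torch
import torch.nn as nn


class SheafDiffusionLayer(nn.Module):
    def __init__(self, feat_dim: int, stalk_dim: int):
        super().__init__()
        self.feat_dim = feat_dim
        self.stalk_dim = stalk_dim
        # Predicts a (stalk_dim x feat_dim) restriction map for an edge
        # endpoint from that endpoint's current representation.
        self.restriction = nn.Linear(feat_dim, stalk_dim * feat_dim)
        self.alpha = nn.Parameter(torch.tensor(0.5))  # diffusion step size

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, feat_dim), users and items stacked in one matrix.
        # edge_index: (2, num_edges), row 0 = user ids, row 1 = item ids.
        src, dst = edge_index
        d, k = self.feat_dim, self.stalk_dim

        # Restriction maps F_u|e and F_v|e for every edge e = (u, v).
        F_src = self.restriction(x[src]).view(-1, k, d)
        F_dst = self.restriction(x[dst]).view(-1, k, d)

        # Compare the two endpoints inside the shared edge stalk.
        x_src = x[src].unsqueeze(-1)                               # (E, d, 1)
        x_dst = x[dst].unsqueeze(-1)
        diff = torch.bmm(F_src, x_src) - torch.bmm(F_dst, x_dst)  # (E, k, 1)

        # Pull the disagreement back to each endpoint via the transposed
        # maps and accumulate per node; this mimics a sheaf Laplacian.
        msg_src = torch.bmm(F_src.transpose(1, 2), diff).squeeze(-1)   # (E, d)
        msg_dst = torch.bmm(F_dst.transpose(1, 2), -diff).squeeze(-1)  # (E, d)
        lap = torch.zeros_like(x)
        lap.index_add_(0, src, msg_src)
        lap.index_add_(0, dst, msg_dst)

        # One explicit diffusion step: x <- x - alpha * L_F(x).
        return x - self.alpha * lap


# Toy usage: 4 users and 6 items in one shared id space (items offset
# by the number of users), with a handful of interaction edges.
if __name__ == "__main__":
    num_users, d = 4, 16
    x = torch.randn(num_users + 6, d)
    edge_index = torch.tensor([[0, 0, 1, 2, 3],
                               [4, 5, 6, 7, 9]])  # user -> item edges
    layer = SheafDiffusionLayer(feat_dim=d, stalk_dim=8)
    h = layer(x, edge_index)
    score = h[0] @ h[4]  # score for user 0 and item 4 (id 4 = first item)
```

Recommendation scores can then be read off as dot products between the updated user and item rows, as in standard embedding-based collaborative filtering.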
Related papers
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, which aims to assess the performance of a specific GNN model, trained on observed labeled graphs, when it is deployed on unseen graphs without labels.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z) - How Expressive are Graph Neural Networks in Recommendation? [17.31401354442106]
Graph Neural Networks (GNNs) have demonstrated superior performance on various graph learning tasks, including recommendation.
Recent research has explored the expressiveness of GNNs in general, demonstrating that message passing GNNs are at most as powerful as the Weisfeiler-Lehman test.
We propose the topological closeness metric to evaluate GNNs' ability to capture the structural distance between nodes.
arXiv Detail & Related papers (2023-08-22T02:17:34Z) - Factor Graph Neural Networks [20.211455592922736]
Graph Neural Networks (GNNs) can learn powerful representations in an end-to-end fashion with great success in many real-world applications.
We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning.
arXiv Detail & Related papers (2023-08-02T00:32:02Z) - Reducing Over-smoothing in Graph Neural Networks Using Relational Embeddings [0.15619750966454563]
We propose a new, simple, and efficient method to alleviate the over-smoothing problem in GNNs.
Our method can be combined with other techniques to achieve the best performance.
arXiv Detail & Related papers (2023-01-07T19:26:04Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by the limited ability of finite-depth GNNs to capture long-range dependencies, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z) - Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data that works across node classification, link prediction, and graph classification tasks; a minimal sketch of this kind of perturbation loop appears after this list.
arXiv Detail & Related papers (2020-10-19T21:51:47Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Self-Enhanced GNN: Improving Graph Neural Networks Using Model Outputs [20.197085398581397]
Graph neural networks (GNNs) have received much attention recently because of their excellent performance on graph-based tasks.
We propose self-enhanced GNN (SEG), which improves the quality of the input data using the outputs of existing GNN models.
SEG consistently improves the performance of well-known GNN models such as GCN, GAT and SGC across different datasets.
arXiv Detail & Related papers (2020-02-18T12:27:16Z) - Revisiting Graph based Collaborative Filtering: A Linear Residual Graph Convolutional Network Approach [55.44107800525776]
Graph Convolutional Networks (GCNs) are state-of-the-art graph based representation learning models.
In this paper, we revisit GCN-based Collaborative Filtering (CF) for Recommender Systems (RS).
We show that removing non-linearities enhances recommendation performance, consistent with theoretical findings on simple graph convolutional networks.
We propose a linear residual network structure specifically designed for CF with user-item interaction modeling; a minimal sketch of this kind of linear residual propagation appears after this list.
arXiv Detail & Related papers (2020-01-28T04:41:25Z)
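For the FLAG entry above, the following sketch shows one training step with gradient-based adversarial augmentation of node features: a small random perturbation is refined by a few sign-gradient ascent steps on the loss while the model's gradients accumulate across steps. Hyperparameter names (`step_size`, `ascent_steps`) and the exact update rule are assumptions for illustration, not a verbatim copy of the official FLAG code.

```python
# Minimal sketch of FLAG-style adversarial feature augmentation for one
# training step; variable names and the sign-ascent update are
# illustrative assumptions, not the official FLAG implementation.
import torch


def flag_step(model, x, edge_index, y, loss_fn, optimizer,
              step_size: float = 1e-3, ascent_steps: int = 3):
    """One optimizer step with adversarially perturbed node features."""
    model.train()
    optimizer.zero_grad()

    # Start from a small random perturbation of the node features.
    perturb = torch.empty_like(x).uniform_(-step_size, step_size)
    perturb.requires_grad_()

    loss = loss_fn(model(x + perturb, edge_index), y) / ascent_steps
    for _ in range(ascent_steps - 1):
        loss.backward()
        # Gradient *ascent* on the perturbation (sign update), then
        # re-evaluate the loss; model gradients accumulate across steps.
        perturb.data = perturb.data + step_size * torch.sign(perturb.grad)
        perturb.grad.zero_()
        loss = loss_fn(model(x + perturb, edge_index), y) / ascent_steps

    loss.backward()
    optimizer.step()
    return loss.item()
```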
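For the linear residual collaborative-filtering entry above, the sketch below propagates user and item embeddings over a symmetrically normalized user-item adjacency matrix with no non-linearities and no per-layer feature transforms, summing the outputs of all propagation depths as a residual connection. The normalization choice and the residual sum are common design choices assumed here for illustration; this is not the authors' exact architecture.

```python
# Minimal sketch of linear (non-linearity-free) residual graph
# convolution for collaborative filtering; design choices are
# illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn


class LinearResidualGCF(nn.Module):
    def __init__(self, num_users: int, num_items: int, dim: int, num_layers: int = 3):
        super().__init__()
        # One embedding table for users and items in a shared id space.
        self.embed = nn.Embedding(num_users + num_items, dim)
        self.num_layers = num_layers

    def forward(self, norm_adj: torch.Tensor) -> torch.Tensor:
        # norm_adj: (N, N) symmetrically normalized user-item adjacency
        # (D^{-1/2} A D^{-1/2}), where N = num_users + num_items.
        e = self.embed.weight
        out = e
        for _ in range(self.num_layers):
            # Linear propagation: no activation, no weight matrix.
            e = torch.sparse.mm(norm_adj, e) if norm_adj.is_sparse else norm_adj @ e
            out = out + e  # residual sum over propagation depths
        return out / (self.num_layers + 1)
```

Scores for a user-item pair are then dot products between the corresponding rows of the returned embedding matrix.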
This list is automatically generated from the titles and abstracts of the papers in this site.