Hierarchical BiGraph Neural Network as Recommendation Systems
- URL: http://arxiv.org/abs/2007.16000v1
- Date: Mon, 27 Jul 2020 18:01:41 GMT
- Title: Hierarchical BiGraph Neural Network as Recommendation Systems
- Authors: Dom Huh
- Abstract summary: We propose a hierarchical approach that uses GNNs as recommendation systems and structures the user-item features within a bigraph framework.
Our experimental results show performance competitive with current recommendation system methods, as well as transferability.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks emerge as a promising modeling method for applications dealing with datasets that are best represented in the graph domain. In particular, developing recommendation systems often requires addressing sparse, structured data that lacks feature richness on the user and/or item side and must be processed within the correct context for optimal performance. Such datasets map intuitively to networks or graphs. In this paper, we propose the Hierarchical BiGraph Neural Network (HBGNN), a hierarchical approach that uses GNNs as recommendation systems and structures the user-item features within a bigraph framework. Our experimental results show performance competitive with current recommendation system methods, as well as transferability.
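The abstract gives no implementation detail, but the core idea it describes (message passing over a user-item bigraph followed by pairwise scoring) can be sketched as follows. This is a minimal illustration assuming PyTorch; the class `BiGraphRecommender` and all parameter names are hypothetical and not taken from the paper.

```python
# Minimal sketch of GNN-based recommendation on a user-item bigraph.
# Assumes PyTorch; all names (BiGraphRecommender, etc.) are illustrative,
# not the HBGNN architecture itself.
import torch
import torch.nn as nn


class BiGraphRecommender(nn.Module):
    """One round of message passing over a user-item bipartite graph,
    followed by dot-product scoring of user-item pairs."""

    def __init__(self, n_users: int, n_items: int, dim: int = 32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        self.msg_user = nn.Linear(dim, dim)   # item -> user messages
        self.msg_item = nn.Linear(dim, dim)   # user -> item messages

    def propagate(self, interactions: torch.Tensor):
        # interactions: (n_users, n_items) binary matrix of observed feedback.
        deg_u = interactions.sum(1, keepdim=True).clamp(min=1)
        deg_i = interactions.sum(0, keepdim=True).clamp(min=1).T
        u, i = self.user_emb.weight, self.item_emb.weight
        # Mean-aggregate neighbor embeddings across the bigraph.
        u_new = torch.relu(u + self.msg_user(interactions @ i / deg_u))
        i_new = torch.relu(i + self.msg_item(interactions.T @ u / deg_i))
        return u_new, i_new

    def score(self, interactions, users, items):
        u_new, i_new = self.propagate(interactions)
        return (u_new[users] * i_new[items]).sum(-1)  # predicted affinity


if __name__ == "__main__":
    torch.manual_seed(0)
    R = (torch.rand(100, 50) < 0.05).float()          # toy interaction matrix
    model = BiGraphRecommender(n_users=100, n_items=50)
    print(model.score(R, torch.tensor([0, 1]), torch.tensor([3, 7])))
```

In the paper's hierarchical setting, such a bigraph module would be only one building block; the sketch above covers a single round of propagation and dot-product scoring.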
Related papers
- TANGNN: a Concise, Scalable and Effective Graph Neural Networks with Top-m Attention Mechanism for Graph Representation Learning [7.879217146851148]
We propose an innovative Graph Neural Network (GNN) architecture that integrates a Top-m attention mechanism aggregation component and a neighborhood aggregation component.
To assess the effectiveness of our proposed model, we have applied it to citation sentiment prediction, a novel task previously unexplored in the GNN field.
arXiv Detail & Related papers (2024-11-23T05:31:25Z) - DGNN: Decoupled Graph Neural Networks with Structural Consistency
between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - Preference and Concurrence Aware Bayesian Graph Neural Networks for
Recommender Systems [5.465420718331109]
Graph-based collaborative filtering methods deliver strong performance for recommender systems.
We propose an efficient generative model that jointly considers the preferences of users, the concurrence of items, and important graph structure information.
arXiv Detail & Related papers (2023-11-30T11:49:33Z) - Ordinal Graph Gamma Belief Network for Social Recommender Systems [54.9487910312535]
We develop a hierarchical Bayesian model termed ordinal graph factor analysis (OGFA), which jointly models user-item and user-user interactions.
OGFA not only achieves good recommendation performance, but also extracts interpretable latent factors corresponding to representative user preferences.
We extend OGFA to the ordinal graph gamma belief network, a multi-stochastic-layer deep probabilistic model.
arXiv Detail & Related papers (2022-09-12T09:19:22Z) - GPN: A Joint Structural Learning Framework for Graph Neural Networks [36.38529113603987]
We propose a GNN-based joint learning framework that simultaneously learns the graph structure and the downstream task.
Our method is the first GNN-based bilevel optimization framework for resolving this task.
arXiv Detail & Related papers (2022-05-12T09:06:04Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Attention-Based Recommendation On Graphs [9.558392439655012]
Graph Neural Networks (GNN) have shown remarkable performance in different tasks.
In this study, we propose GARec as a model-based recommender system.
The presented method outperforms existing model-based approaches, non-graph neural networks, and graph neural networks on different MovieLens datasets.
arXiv Detail & Related papers (2022-01-04T21:02:02Z) - Graph Convolutional Embeddings for Recommender Systems [67.5973695167534]
We propose a graph convolutional embedding layer for N-partite graphs that processes user-item-context interactions.
arXiv Detail & Related papers (2021-03-05T10:46:16Z) - Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that formulates the sampling procedure and message passing of GNNs as a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)