Revisiting Graph based Collaborative Filtering: A Linear Residual Graph
Convolutional Network Approach
- URL: http://arxiv.org/abs/2001.10167v1
- Date: Tue, 28 Jan 2020 04:41:25 GMT
- Title: Revisiting Graph based Collaborative Filtering: A Linear Residual Graph
Convolutional Network Approach
- Authors: Lei Chen, Le Wu, Richang Hong, Kun Zhang, Meng Wang
- Abstract summary: Graph Convolutional Networks (GCNs) are state-of-the-art graph-based representation learning models.
In this paper, we revisit GCN-based Collaborative Filtering (CF) models for Recommender Systems (RS).
We show that removing non-linearities enhances recommendation performance, consistent with the theory of simple graph convolutional networks.
We propose a residual network structure that is specifically designed for CF with user-item interaction modeling.
- Score: 55.44107800525776
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs) are state-of-the-art graph-based
representation learning models built by iteratively stacking multiple layers of
convolution aggregation operations and non-linear activation operations.
Recently, in Collaborative Filtering (CF) based Recommender Systems (RS), by
treating the user-item interaction behavior as a bipartite graph, some
researchers model higher-order collaborative signals with GCNs. These GCN-based
recommender models show superior performance compared to traditional approaches.
However, these models are difficult to train with non-linear activations on
large user-item graphs. Besides, most GCN-based models cannot model deeper
layers due to the over-smoothing effect of the graph convolution operation. In
this paper, we revisit GCN-based CF models from two aspects. First, we
empirically show that removing non-linearities enhances recommendation
performance, which is consistent with the theory of simple graph convolutional
networks. Second, we propose a residual network structure that is specifically
designed for CF with user-item interaction modeling, which alleviates the
over-smoothing problem of the graph convolution aggregation operation on sparse
user-item interaction data. The proposed model is linear, easy to train, and
scales to large datasets, yielding better efficiency and effectiveness on two
real datasets. We publish the source code at
https://github.com/newlei/LRGCCF.
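As a rough illustration of the two ideas named in the abstract, activation-free (linear) propagation over the user-item bipartite graph and a residual-style aggregation that keeps every layer's embedding, here is a minimal NumPy/SciPy sketch. The choice of normalization, self-loops, embedding size, and the BPR-style training loss mentioned in the comments are assumptions for illustration only; the authors' actual implementation is in the linked repository.

```python
import numpy as np
import scipy.sparse as sp

def build_norm_adj(num_users, num_items, interactions):
    """Row-normalized bipartite adjacency with self-loops (an assumed
    normalization; the paper's exact choice may differ)."""
    n = num_users + num_items
    rows = [u for u, i in interactions] + [num_users + i for u, i in interactions]
    cols = [num_users + i for u, i in interactions] + [u for u, i in interactions]
    adj = sp.coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
    adj = adj + sp.eye(n)                      # self-loops keep each node's own signal
    deg = np.asarray(adj.sum(axis=1)).ravel()
    return (sp.diags(1.0 / np.maximum(deg, 1.0)) @ adj).tocsr()

def propagate(norm_adj, emb0, num_layers=3):
    """Linear propagation: no non-linear activation between layers. Keeping every
    layer's output and concatenating them is the residual-style aggregation that
    counters over-smoothing."""
    layers = [emb0]
    for _ in range(num_layers):
        layers.append(np.asarray(norm_adj @ layers[-1]))  # E^(k+1) = D^-1 (A + I) E^(k)
    return np.concatenate(layers, axis=1)                 # [E^0 || E^1 || ... || E^K]

def score(final_emb, num_users, user, item):
    """Predicted preference: inner product of the concatenated embeddings,
    i.e. a sum of per-layer inner products."""
    return float(final_emb[user] @ final_emb[num_users + item])

# Toy usage: 3 users, 4 items, a handful of observed interactions.
rng = np.random.default_rng(0)
interactions = [(0, 0), (0, 2), (1, 1), (2, 3)]
emb0 = rng.normal(scale=0.1, size=(3 + 4, 8))  # free embeddings, trained with e.g. a BPR loss
A_hat = build_norm_adj(3, 4, interactions)
E = propagate(A_hat, emb0, num_layers=3)
print(score(E, 3, user=0, item=2))
```

Because no non-linearity sits between layers, the K-layer propagation collapses into a fixed sparse polynomial of the normalized adjacency applied to the initial embeddings, which is what makes the model easy to train and scale; concatenating the per-layer embeddings preserves lower-layer signals that deep stacking would otherwise smooth away.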
Related papers
- Challenging the Myth of Graph Collaborative Filtering: a Reasoned and Reproducibility-driven Analysis [50.972595036856035]
We present code that successfully replicates the results of six popular and recent graph recommendation models.
We compare these graph models with traditional collaborative filtering models that historically performed well in offline evaluations.
By investigating the information flow from users' neighborhoods, we aim to identify which models are influenced by intrinsic features in the dataset structure.
arXiv Detail & Related papers (2023-08-01T09:31:44Z)
- AGNN: Alternating Graph-Regularized Neural Networks to Alleviate Over-Smoothing [29.618952407794776]
We propose an Alternating Graph-regularized Neural Network (AGNN) composed of a Graph Convolutional Layer (GCL) and a Graph Embedding Layer (GEL).
GEL is derived from a graph-regularized optimization containing a Laplacian embedding term, which can alleviate the over-smoothing problem.
AGNN is evaluated via a large number of experiments, including performance comparisons with several multi-layer or multi-order graph neural networks.
arXiv Detail & Related papers (2023-04-14T09:20:03Z)
- DRGCN: Dynamic Evolving Initial Residual for Deep Graph Convolutional Networks [19.483662490506646]
We propose a novel model called Dynamic evolving initial Residual Graph Convolutional Network (DRGCN).
Our experimental results show that our model effectively relieves the problem of over-smoothing in deep GCNs.
Our model reaches new SOTA results on the large-scale ogbn-arxiv dataset of the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-02-10T06:57:12Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Causal Incremental Graph Convolution for Recommender System Retraining [89.25922726558875]
Real-world recommender systems need to be regularly retrained to keep up with new data.
In this work, we consider how to efficiently retrain graph convolution network (GCN) based recommender models.
arXiv Detail & Related papers (2021-08-16T04:20:09Z)
- Training Robust Graph Neural Networks with Topology Adaptive Edge Dropping [116.26579152942162]
Graph neural networks (GNNs) are processing architectures that exploit graph structural information to model representations from network data.
Despite their success, GNNs suffer from sub-optimal generalization performance given limited training data.
This paper proposes Topology Adaptive Edge Dropping to improve generalization performance and learn robust GNN models.
arXiv Detail & Related papers (2021-06-05T13:20:36Z)
- Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training; a generic sketch of this perturbation loop appears after this list.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
arXiv Detail & Related papers (2020-10-19T21:51:47Z)
- RGCF: Refined Graph Convolution Collaborative Filtering with concise and expressive embedding [42.46797662323393]
We develop a new GCN-based Collaborative Filtering model, named Refined Graph Convolution Collaborative Filtering (RGCF).
RGCF is more capable of capturing the implicit high-order connectivities inside the graph, and the resulting vector representations are more expressive.
We conduct extensive experiments on three public million-size datasets, demonstrating that our RGCF significantly outperforms state-of-the-art models.
arXiv Detail & Related papers (2020-07-07T12:26:10Z)
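The FLAG entry above describes iteratively augmenting node features with gradient-based adversarial perturbations during training. The PyTorch sketch below shows one way such an inner perturbation loop can look; the model interface (`model(x, edge_index)` returning node logits), the number of inner steps, and the step size are assumptions, and this is a generic illustration rather than the FLAG authors' exact algorithm.

```python
import torch
import torch.nn.functional as F

def adversarial_feature_step(model, x, edge_index, y, optimizer, steps=3, step_size=1e-3):
    """One training step with gradient-based adversarial feature augmentation
    (a generic sketch in the spirit of FLAG, not the authors' exact algorithm).
    Assumes model(x, edge_index) returns per-node classification logits."""
    model.train()
    optimizer.zero_grad()

    # Start from a small random perturbation of the node features.
    delta = torch.empty_like(x).uniform_(-step_size, step_size).requires_grad_(True)
    loss = F.cross_entropy(model(x + delta, edge_index), y) / steps
    loss.backward()

    for _ in range(steps - 1):
        # Ascend on the perturbation while parameter gradients keep accumulating.
        delta = (delta.detach() + step_size * delta.grad.detach().sign()).requires_grad_(True)
        loss = F.cross_entropy(model(x + delta, edge_index), y) / steps
        loss.backward()

    optimizer.step()  # one descent step on the gradients accumulated over the perturbed views
    return loss.item() * steps
```

The loss is divided by the number of inner steps so that the accumulated parameter gradients amount to an average over the perturbed views before the single optimizer step.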