LightGCN: Evaluated and Enhanced
- URL: http://arxiv.org/abs/2312.16183v1
- Date: Sun, 17 Dec 2023 15:18:18 GMT
- Title: LightGCN: Evaluated and Enhanced
- Authors: Milena Kapralova, Luca Pantea and Andrei Blahovici
- Abstract summary: LightGCN enables linear propagation of embeddings, enhancing performance.
We reproduce the original findings, assess LightGCN's robustness on diverse datasets and metrics, and explore Graph Diffusion as an augmentation of signal propagation in LightGCN.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper analyses LightGCN in the context of graph recommendation
algorithms. Although Graph Convolutional Networks were originally designed for
graph classification, their non-linear operations are not always essential.
LightGCN enables linear propagation of embeddings, enhancing performance. We
reproduce the original findings, assess LightGCN's robustness on diverse
datasets and metrics, and explore Graph Diffusion as an augmentation of signal
propagation in LightGCN.
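For reference, the linear propagation referred to above can be stated compactly. A minimal LaTeX sketch of the LightGCN propagation and layer-combination rules (notation follows the original LightGCN paper; the uniform weights $\alpha_k = 1/(K+1)$ are its default choice):

$$
\mathbf{e}_u^{(k+1)} = \sum_{i \in \mathcal{N}_u} \frac{1}{\sqrt{|\mathcal{N}_u|\,|\mathcal{N}_i|}}\, \mathbf{e}_i^{(k)},
\qquad
\mathbf{e}_u = \sum_{k=0}^{K} \alpha_k\, \mathbf{e}_u^{(k)}
$$

Here $\mathcal{N}_u$ is the set of items user $u$ has interacted with (and symmetrically for items); no feature transformation or non-linear activation is applied between layers.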
Related papers
- L^2GC: Lorentzian Linear Graph Convolutional Networks for Node Classification [12.69417276887153]
We propose a novel framework for Lorentzian linear GCN.
We map the learned features of graph nodes into hyperbolic space.
We then perform a Lorentzian linear feature transformation to capture the underlying tree-like structure of data.
arXiv Detail & Related papers (2024-03-10T02:16:13Z)
- Nonlinear Correct and Smooth for Semi-Supervised Learning [1.622641093702668]
Graph-based semi-supervised learning (GSSL) has been used successfully in various applications.
We propose Nonlinear Correct and Smooth (NLCS), which improves the existing post-processing approach by incorporating non-linearity and higher-order representation.
arXiv Detail & Related papers (2023-10-09T14:33:32Z)
- SStaGCN: Simplified stacking based graph convolutional networks [2.556756699768804]
Graph convolutional network (GCN) is a powerful model studied broadly in various graph structural data learning tasks.
We propose a novel GCN called SStaGCN (Simplified stacking based GCN) by utilizing the ideas of stacking and aggregation.
We show that SStaGCN can efficiently mitigate the over-smoothing problem of GCN.
arXiv Detail & Related papers (2021-11-16T05:00:08Z)
- Dissecting the Diffusion Process in Linear Graph Convolutional Networks [71.30132908130581]
Graph Convolutional Networks (GCNs) have attracted more and more attention in recent years.
Recent works show that a linear GCN can achieve comparable performance to the original non-linear GCN.
We propose Decoupled Graph Convolution (DGC) that decouples the terminal time and the feature propagation steps.
arXiv Detail & Related papers (2021-02-22T02:45:59Z)
- Bi-GCN: Binary Graph Convolutional Network [57.733849700089955]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node features.
Our Bi-GCN can reduce the memory consumption by an average of 30x for both the network parameters and input data, and accelerate the inference speed by an average of 47x.
arXiv Detail & Related papers (2020-10-15T07:26:23Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
- Knowledge Embedding Based Graph Convolutional Network [35.35776808660919]
This paper proposes a novel framework, namely the Knowledge Embedding based Graph Convolutional Network (KE-GCN).
KE-GCN combines the power of Graph Convolutional Network (GCN) in graph-based belief propagation and the strengths of advanced knowledge embedding methods.
Our theoretical analysis shows that KE-GCN offers an elegant unification of several well-known GCN methods as specific cases.
arXiv Detail & Related papers (2020-06-12T17:12:51Z)
- Directed Graph Convolutional Network [15.879411956536885]
We extend spectral-based graph convolution to directed graphs by using first- and second-order proximity.
A new GCN model, called DGCN, is then designed to learn representations on the directed graph.
arXiv Detail & Related papers (2020-04-29T06:19:10Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
- LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation [100.76229017056181]
Graph Convolution Network (GCN) has become the new state-of-the-art for collaborative filtering.
In this work, we aim to simplify the design of GCN to make it more concise and appropriate for recommendation.
We propose a new model named LightGCN, including only the most essential component in GCN -- neighborhood aggregation.
arXiv Detail & Related papers (2020-02-06T06:53:42Z)
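As a rough illustration of the neighborhood-aggregation-only design described in the last entry above, here is a minimal Python sketch (an assumption-laden illustration, not the authors' reference implementation: the SciPy-based bipartite adjacency, dense NumPy embeddings, and uniform layer weights are choices made for brevity):

    import numpy as np
    import scipy.sparse as sp

    def lightgcn_propagate(interactions, user_emb, item_emb, num_layers=3):
        """LightGCN-style linear propagation on a user-item bipartite graph.

        interactions: sparse (n_users, n_items) binary interaction matrix R.
        Returns layer-averaged user and item embeddings.
        """
        n_users, n_items = interactions.shape
        # Bipartite adjacency A = [[0, R], [R^T, 0]].
        A = sp.bmat([[None, interactions], [interactions.T, None]], format="csr")
        # Symmetric normalization D^{-1/2} A D^{-1/2}, guarding against isolated nodes.
        deg = np.asarray(A.sum(axis=1)).flatten().astype(np.float64)
        d_inv_sqrt = np.zeros_like(deg)
        d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
        A_hat = sp.diags(d_inv_sqrt) @ A @ sp.diags(d_inv_sqrt)

        emb = np.vstack([user_emb, item_emb])   # layer-0 (learned) embeddings
        layer_sum = emb.copy()
        for _ in range(num_layers):
            emb = A_hat @ emb                   # purely linear neighborhood aggregation
            layer_sum = layer_sum + emb
        # Uniform layer combination (alpha_k = 1/(K+1)); a graph-diffusion-style
        # reweighting of the layer embeddings would plug in here instead.
        final = layer_sum / (num_layers + 1)
        return final[:n_users], final[n_users:]

Recommendation scores are then inner products between the propagated user and item embeddings; only the layer-0 embeddings are trainable parameters.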