Dissecting the Diffusion Process in Linear Graph Convolutional Networks
- URL: http://arxiv.org/abs/2102.10739v1
- Date: Mon, 22 Feb 2021 02:45:59 GMT
- Title: Dissecting the Diffusion Process in Linear Graph Convolutional Networks
- Authors: Yifei Wang, Yisen Wang, Jiansheng Yang, Zhouchen Lin
- Abstract summary: Graph Convolutional Networks (GCNs) have attracted increasing attention in recent years.
Recent works show that a linear GCN can achieve comparable performance to the original non-linear GCN.
We propose Decoupled Graph Convolution (DGC) that decouples the terminal time and the feature propagation steps.
- Score: 71.30132908130581
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs) have attracted increasing attention
in recent years. A typical GCN layer consists of a linear feature propagation
step and a nonlinear transformation step. Recent works show that a linear GCN
can achieve comparable performance to the original non-linear GCN while being
much more computationally efficient. In this paper, we dissect the feature
propagation steps of linear GCNs from a perspective of continuous graph
diffusion, and analyze why linear GCNs fail to benefit from more propagation
steps. Following that, we propose Decoupled Graph Convolution (DGC) that
decouples the terminal time and the feature propagation steps, making it more
flexible and capable of exploiting a very large number of feature propagation
steps. Experiments demonstrate that our proposed DGC improves linear GCNs by a
large margin and makes them competitive with many modern variants of non-linear
GCNs.
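To make the decoupling concrete: the "continuous graph diffusion" view reads feature propagation as numerically integrating a heat equation over the graph up to a terminal time T, using K discrete steps of size T/K. Under that reading, T controls how strongly features are smoothed, while K only controls integration accuracy, so a very large number of propagation steps no longer hurts. Below is a minimal NumPy sketch of this idea; the normalization choice, step rule, and all names are illustrative assumptions, not code from the paper.

```python
import numpy as np

def dgc_propagate(X, A, T=2.0, K=100):
    """Decoupled propagation sketch: K Euler steps of the graph heat
    equation dX/dt = -(I - S) X, integrated up to terminal time T.
    S is the symmetrically normalized adjacency with self-loops
    (an assumed normalization; see the paper for the exact variant)."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                 # add self-loops
    d = A_hat.sum(axis=1)                 # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt   # normalized adjacency
    dt = T / K                            # step size: terminal time and
                                          # step count are decoupled
    P = (1.0 - dt) * np.eye(n) + dt * S   # one Euler step of the diffusion
    for _ in range(K):
        X = P @ X                         # purely linear propagation
    return X

# Same terminal time, different step counts: the outputs differ only by
# integration error, so more steps refine rather than over-smooth.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)
print(np.max(np.abs(dgc_propagate(X, A, T=2.0, K=10)
                    - dgc_propagate(X, A, T=2.0, K=1000))))
```

As K grows, the iteration approaches the exact diffusion exp(-T(I - S)) applied to X, so accuracy improves while the amount of smoothing stays fixed by T.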
Related papers
- Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z)
- L^2GC: Lorentzian Linear Graph Convolutional Networks for Node Classification [12.69417276887153]
We propose a novel framework for Lorentzian linear GCNs.
We map the learned features of graph nodes into hyperbolic space.
We then perform a Lorentzian linear feature transformation to capture the underlying tree-like structure of the data.
arXiv Detail & Related papers (2024-03-10T02:16:13Z)
- LightGCN: Evaluated and Enhanced [0.0]
LightGCN enables linear propagation of embeddings, enhancing performance.
We reproduce the original findings, assess LightGCN's robustness on diverse datasets and metrics, and explore Graph Diffusion as an augmentation of signal propagation in LightGCN; a sketch of this style of linear propagation appears after this list.
arXiv Detail & Related papers (2023-12-17T15:18:18Z)
- Old can be Gold: Better Gradient Flow can Make Vanilla-GCNs Great Again [96.4999517230259]
We provide a new gradient-flow perspective to understand the substandard performance of deep GCNs.
We propose gradient-guided dynamic rewiring of vanilla-GCNs with skip connections.
Our methods significantly boost their performance, enabling them to compete with and outperform many state-of-the-art methods.
arXiv Detail & Related papers (2022-10-14T21:30:25Z)
- Optimization-Induced Graph Implicit Nonlinear Diffusion [64.39772634635273]
We propose a new kind of graph convolution variant, called Graph Implicit Nonlinear Diffusion (GIND).
GIND implicitly has access to infinite hops of neighbors while adaptively aggregating features with nonlinear diffusion to prevent over-smoothing.
We show that the learned representation can be formalized as the minimizer of an explicit convex optimization objective.
arXiv Detail & Related papers (2022-06-29T06:26:42Z)
- Orthogonal Graph Neural Networks [53.466187667936026]
Graph neural networks (GNNs) have received tremendous attention due to their effectiveness in learning node representations.
However, stacking more convolutional layers significantly decreases the performance of GNNs.
We propose a novel Ortho-GConv, which can generally augment existing GNN backbones to stabilize model training and improve generalization performance.
arXiv Detail & Related papers (2021-09-23T12:39:01Z)
- Directed Graph Convolutional Network [15.879411956536885]
We extend spectral-based graph convolution to directed graphs by using first- and second-order proximity.
A new GCN model, called DGCN, is then designed to learn representations on the directed graph.
arXiv Detail & Related papers (2020-04-29T06:19:10Z)
- Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks [0.0]
Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.
Here, we propose to augment conventional GCNs with geometric scattering transforms and residual convolutions.
The former enables band-pass filtering of graph signals, thus alleviating the so-called oversmoothing often encountered in GCNs.
arXiv Detail & Related papers (2020-03-18T18:03:08Z)
- Cross-GCN: Enhancing Graph Convolutional Network with $k$-Order Feature Interactions [153.6357310444093]
Graph Convolutional Network (GCN) is an emerging technique that performs learning and reasoning on graph data.
We argue that existing designs of GCN forgo modeling cross features, making GCN less effective for tasks or data where cross features are important.
We design a new operator named Cross-feature Graph Convolution, which explicitly models arbitrary-order cross features with complexity linear in the feature dimension and order size.
arXiv Detail & Related papers (2020-03-05T13:05:27Z)
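The LightGCN entry above mentions linear propagation of embeddings; as promised, here is a hedged sketch of that scheme: layer-wise multiplication by the normalized adjacency with no feature transformation or nonlinearity, followed by averaging the per-layer embeddings. The symmetric normalization and uniform layer weights below are based on the original LightGCN formulation, not on the summary above, so treat them as assumptions.

```python
import numpy as np

def lightgcn_propagate(E, A, K=3):
    """Sketch of LightGCN-style linear propagation: repeatedly multiply
    embeddings by the symmetrically normalized adjacency (no self-loops,
    no transformation, no nonlinearity) and average across layers.
    The uniform layer weights 1/(K+1) are an assumption taken from the
    original LightGCN paper, not from the summary above."""
    d = A.sum(axis=1)                    # node degrees
    d[d == 0] = 1.0                      # guard against isolated nodes
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A @ D_inv_sqrt      # normalized adjacency
    layers = [E]
    for _ in range(K):
        layers.append(S @ layers[-1])    # purely linear propagation
    return sum(layers) / (K + 1)         # average of layer-wise embeddings
```

The contrast with the DGC sketch above: LightGCN averages the intermediate layer embeddings rather than integrating to a terminal time, but both schemes drop the per-layer feature transformation and nonlinearity.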