Simplifying Graph Convolutional Networks with Redundancy-Free Neighbors
- URL: http://arxiv.org/abs/2504.13426v2
- Date: Mon, 21 Apr 2025 05:10:14 GMT
- Title: Simplifying Graph Convolutional Networks with Redundancy-Free Neighbors
- Authors: Jielong Lu, Zhihao Wu, Zhiling Cai, Yueyang Pi, Shiping Wang
- Abstract summary: We analyze the intrinsic message passing mechanism of Graph Convolutional Networks (GCNs) and find that messages from high-order neighbors must traverse low-order neighbors to reach the target node. This repeated reliance on low-order neighbors leads to redundant information aggregation, a phenomenon we term over-aggregation. Our analysis demonstrates that over-aggregation not only introduces significant redundancy but also serves as the fundamental cause of over-smoothing in GCNs.
- Score: 8.793707476780304
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, Graph Convolutional Networks (GCNs) have gained popularity for their exceptional ability to process graph-structured data. Existing GCN-based approaches typically employ a shallow model architecture due to the over-smoothing phenomenon. Current approaches to mitigating over-smoothing primarily involve adding supplementary components to GCN architectures, such as residual connections and random edge-dropping strategies. However, these improvements toward deep GCNs have achieved only limited success. In this work, we analyze the intrinsic message passing mechanism of GCNs and identify a critical issue: messages originating from high-order neighbors must traverse through low-order neighbors to reach the target node. This repeated reliance on low-order neighbors leads to redundant information aggregation, a phenomenon we term over-aggregation. Our analysis demonstrates that over-aggregation not only introduces significant redundancy but also serves as the fundamental cause of over-smoothing in GCNs.
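The traversal argument above can be made concrete with a small numeric sketch (an illustrative toy, not the paper's method): on a 4-node path graph, stacking GCN-style propagation steps shows how heavily a node's output keeps re-weighting its 1-hop neighbor even as higher-order information only trickles in.

```python
import numpy as np

# Path graph 0-1-2-3 with self-loops (A + I), symmetrically normalized
# as in a standard GCN layer.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

P = np.eye(4)
for k in range(1, 4):
    P = A_hat @ P  # k stacked propagation steps
    # Row 0 of P tells us how node 0's output mixes the input features:
    # every extra layer re-aggregates the 1-hop neighbor's contribution,
    # while the 3-hop node's weight stays comparatively tiny.
    print(f"layer {k}: weight on 1-hop neighbor = {P[0, 1]:.3f}, "
          f"on 3-hop node = {P[0, 3]:.3f}")
```

Even after three layers, when the 3-hop node first becomes reachable, node 0's representation is still dominated by its low-order neighbors — the redundancy the abstract calls over-aggregation.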
Related papers
- TSC: A Simple Two-Sided Constraint against Over-Smoothing [17.274727377858873]
We introduce a simple Two-Sided Constraint (TSC) for Graph Convolutional Neural Networks (GCNs).
The random masking acts on the representation matrix's columns to regulate the degree of information aggregation from neighbors.
The contrastive constraint, applied to the representation matrix's rows, enhances the discriminability of the nodes.
arXiv Detail & Related papers (2024-08-06T12:52:03Z) - Demystifying Oversmoothing in Attention-Based Graph Neural Networks [23.853636836842604]
Oversmoothing in Graph Neural Networks (GNNs) refers to the phenomenon where increasing network depth leads to homogeneous node representations.
Previous work has established that Graph Convolutional Networks (GCNs) exponentially lose expressive power as depth increases.
It remains controversial whether the graph attention mechanism can mitigate oversmoothing.
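A minimal numeric sketch of that homogenization (an assumed toy graph, not taken from the paper): repeatedly applying the symmetric-normalized propagation matrix drives row-normalized node features toward a single shared direction.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)  # path graph + self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

H = rng.standard_normal((4, 2))  # random initial node features
spreads = {}
for depth in (1, 10, 50):
    Hk = np.linalg.matrix_power(A_hat, depth) @ H
    R = Hk / np.linalg.norm(Hk, axis=1, keepdims=True)  # compare directions
    spreads[depth] = np.ptp(R, axis=0).max()  # max per-feature range
    print(f"depth {depth}: max row spread = {spreads[depth]:.6f}")
```

The spread between row-normalized node representations collapses toward zero as depth grows, which is exactly the homogeneity the entry above describes.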
arXiv Detail & Related papers (2023-05-25T14:31:59Z) - Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural
Networks [52.566735716983956]
We propose a graph gradual pruning framework termed CGP to dynamically prune GNNs.
Unlike LTH-based methods, the proposed CGP approach requires no re-training, which significantly reduces the computation costs.
Our proposed strategy greatly improves both training and inference efficiency while matching or even exceeding the accuracy of existing methods.
arXiv Detail & Related papers (2022-07-18T14:23:31Z) - Deep Architecture Connectivity Matters for Its Convergence: A
Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z) - Lightweight Graph Convolutional Networks with Topologically Consistent
Magnitude Pruning [12.18340575383456]
Graph convolution networks (GCNs) are currently mainstream in learning with irregular data.
In this paper, we devise a novel method for lightweight GCN design.
Our proposed approach parses and selects subnetworks with the highest magnitudes while guaranteeing their topological consistency.
arXiv Detail & Related papers (2022-03-25T12:34:11Z) - BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z) - SStaGCN: Simplified stacking based graph convolutional networks [6.742080381542375]
Graph convolutional network (GCN) is a powerful model studied broadly in various graph structural data learning tasks. We propose a novel GCN called SStaGCN (Simplified stacking based GCN) by utilizing the ideas of stacking and aggregation. We show that SStaGCN can efficiently mitigate the over-smoothing problem of GCN.
arXiv Detail & Related papers (2021-11-16T05:00:08Z) - Counting Substructures with Higher-Order Graph Neural Networks:
Possibility and Impossibility Results [58.277290855841976]
We study tradeoffs of computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z) - Spatio-Temporal Inception Graph Convolutional Networks for
Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z) - AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose an adaptive multi-channel graph convolutional network for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN effectively extracts the most correlated information from both node features and topological structures.
arXiv Detail & Related papers (2020-07-05T08:16:03Z) - Scattering GCN: Overcoming Oversmoothness in Graph Convolutional
Networks [0.0]
Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.
Here, we propose to augment conventional GCNs with geometric scattering transforms and residual convolutions.
The former enables band-pass filtering of graph signals, thus alleviating the so-called oversmoothing often encountered in GCNs.
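The band-pass intuition can be sketched with a toy difference-of-diffusions wavelet (a simplification; the paper's geometric scattering transform is richer): a high-frequency graph signal survives the band-pass filter while repeated low-pass diffusion nearly erases it.

```python
import numpy as np

A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)  # path graph + self-loops
T = np.diag(1.0 / A.sum(axis=1)) @ A  # random-walk diffusion operator

x = np.array([1.0, -1.0, 1.0, -1.0])  # high-frequency graph signal
low_pass = np.linalg.matrix_power(T, 4) @ x  # repeated smoothing
band_pass = (T - T @ T) @ x  # difference-of-scales wavelet

print("low-pass energy :", np.linalg.norm(low_pass))
print("band-pass energy:", np.linalg.norm(band_pass))
```

The low-pass output attenuates the oscillating signal toward a constant, while the wavelet retains the variation between diffusion scales — the kind of information a band-pass augmentation can feed back into a GCN.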
arXiv Detail & Related papers (2020-03-18T18:03:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.