Alleviating Over-Smoothing via Aggregation over Compact Manifolds
- URL: http://arxiv.org/abs/2407.19231v1
- Date: Sat, 27 Jul 2024 11:02:12 GMT
- Title: Alleviating Over-Smoothing via Aggregation over Compact Manifolds
- Authors: Dongzhuoran Zhou, Hui Yang, Bo Xiong, Yue Ma, Evgeny Kharlamov,
- Abstract summary: Graph neural networks (GNNs) have achieved significant success in various applications.
Most GNNs learn node features through information aggregation from their neighbors and feature transformation in each layer.
However, the node features become indistinguishable after many layers, leading to performance deterioration.
- Score: 19.559230417122826
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have achieved significant success in various applications. Most GNNs learn node features through information aggregation from their neighbors and feature transformation in each layer. However, the node features become indistinguishable after many layers, leading to performance deterioration: a significant limitation known as over-smoothing. Past work has adopted various techniques to address this issue, such as normalization and skip-connections of layer-wise outputs. In our study, we found that the information aggregations in existing work are all contracted aggregations, with the intrinsic property that features inevitably converge to the same single point after many layers. To this end, we propose the aggregation over compact manifolds method (ACM), which replaces the existing information aggregation with aggregation over compact manifolds, a special type of manifold that avoids contracted aggregation. In this work, we theoretically analyze contracted aggregation and its properties. We also provide an extensive empirical evaluation showing that ACM effectively alleviates over-smoothing and outperforms the state-of-the-art. The code can be found at https://github.com/DongzhuoranZhou/ACM.git.
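The abstract's claim that existing aggregations are contracted, so node features converge to a single point after many layers, can be illustrated with a tiny experiment. The sketch below is an assumption-laden illustration: it uses plain mean aggregation with self-loops on a toy path graph as the contracted aggregation and tracks the average pairwise distance between node features; it demonstrates the over-smoothing behaviour ACM targets, not the paper's compact-manifold operator itself.

```python
import torch

def mean_aggregation_matrix(adj: torch.Tensor) -> torch.Tensor:
    """Row-normalized adjacency with self-loops, i.e. plain mean aggregation."""
    adj = adj + torch.eye(adj.shape[0])
    return adj / adj.sum(dim=1, keepdim=True)

def pairwise_spread(x: torch.Tensor) -> float:
    """Average pairwise distance between node features; this simple
    over-smoothing indicator shrinks toward 0 as the features collapse."""
    return torch.cdist(x, x).mean().item()

# Toy graph: a 5-node path graph with random 16-dimensional node features.
adj = torch.zeros(5, 5)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    adj[i, j] = adj[j, i] = 1.0

propagate = mean_aggregation_matrix(adj)
x = torch.randn(5, 16)

for layer in range(1, 33):
    x = propagate @ x  # one round of (contracted) mean aggregation
    if layer in (1, 8, 32):
        print(f"layer {layer:2d}: spread = {pairwise_spread(x):.4f}")
# The spread decays toward 0: after many layers the node features become
# nearly indistinguishable, which is the over-smoothing effect that
# aggregation over compact manifolds is designed to avoid.
```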
Related papers
- Sequential Signal Mixing Aggregation for Message Passing Graph Neural Networks [2.7719338074999547]
We introduce Sequential Signal Mixing Aggregation (SSMA), a novel plug-and-play aggregation for MPGNNs.
SSMA treats the neighbor features as 2D discrete signals and sequentially convolves them, inherently enhancing the ability to mix features attributed to distinct neighbors.
We show that when combining SSMA with well-established MPGNN architectures, we achieve substantial performance gains across various benchmarks.
arXiv Detail & Related papers (2024-09-28T17:13:59Z) - AGNN: Alternating Graph-Regularized Neural Networks to Alleviate Over-Smoothing [29.618952407794776]
We propose an Alternating Graph-regularized Neural Network (AGNN) composed of a Graph Convolutional Layer (GCL) and a Graph Embedding Layer (GEL).
GEL is derived from a graph-regularized optimization containing a Laplacian embedding term, which can alleviate the over-smoothing problem.
AGNN is evaluated via a large number of experiments including performance comparison with some multi-layer or multi-order graph neural networks.
arXiv Detail & Related papers (2023-04-14T09:20:03Z) - FMGNN: Fused Manifold Graph Neural Network [102.61136611255593]
Graph representation learning has been widely studied and demonstrated effectiveness in various graph tasks.
We propose the Fused Manifold Graph Neural Network (FMGNN), a novel GNN architecture that embeds graphs into different manifolds during training.
Our experiments demonstrate that FMGNN yields superior performance over strong baselines on the benchmarks of node classification and link prediction tasks.
arXiv Detail & Related papers (2023-04-03T15:38:53Z) - Feature Aggregation and Propagation Network for Camouflaged Object Detection [42.33180748293329]
Camouflaged object detection (COD) aims to detect/segment camouflaged objects embedded in the environment.
Several COD methods have been developed, but they still suffer from unsatisfactory performance due to intrinsic similarities between foreground objects and background surroundings.
We propose a novel Feature Aggregation and Propagation Network (FAP-Net) for camouflaged object detection.
arXiv Detail & Related papers (2022-12-02T05:54:28Z) - Multi-scale Feature Aggregation for Crowd Counting [84.45773306711747]
We propose a multi-scale feature aggregation network (MSFANet)
MSFANet consists of two feature aggregation modules: the short aggregation (ShortAgg) and the skip aggregation (SkipAgg)
arXiv Detail & Related papers (2022-08-10T10:23:12Z) - Attention-driven Graph Clustering Network [49.040136530379094]
We propose a novel deep clustering method named Attention-driven Graph Clustering Network (AGCN)
AGCN exploits a heterogeneous-wise fusion module to dynamically fuse the node attribute feature and the topological graph feature.
AGCN can jointly perform feature learning and cluster assignment in an unsupervised fashion.
arXiv Detail & Related papers (2021-08-12T02:30:38Z) - Clustering multilayer graphs with missing nodes [4.007017852999008]
Clustering is a fundamental problem in network analysis where the goal is to regroup nodes with similar connectivity profiles.
We propose a new framework that allows for layers to be defined on different sets of nodes.
arXiv Detail & Related papers (2021-03-04T18:56:59Z) - Multi-scale Interactive Network for Salient Object Detection [91.43066633305662]
We propose the aggregate interaction modules to integrate the features from adjacent levels.
To obtain more efficient multi-scale features, the self-interaction modules are embedded in each decoder unit.
Experimental results on five benchmark datasets demonstrate that the proposed method without any post-processing performs favorably against 23 state-of-the-art approaches.
arXiv Detail & Related papers (2020-07-17T15:41:37Z) - Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN)
arXiv Detail & Related papers (2020-06-12T07:18:02Z) - Associating Multi-Scale Receptive Fields for Fine-grained Recognition [5.079292308180334]
We propose a novel cross-layer non-local (CNL) module to associate multi-scale receptive fields by two operations.
CNL computes correlations between features of a query layer and all response layers.
Our model builds spatial dependencies among multi-level layers and learns more discriminative features.
arXiv Detail & Related papers (2020-05-19T01:16:31Z) - Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
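The composite-kernel aggregation summarized in the last entry lends itself to a small sketch. The concrete choices below (a fixed neighbor-indicator kernel composed, by masking, with a learnable dot-product kernel on linearly projected features, normalized with a row-wise softmax) are illustrative assumptions rather than the paper's actual construction.

```python
import torch
import torch.nn as nn

class CompositeKernelAggregation(nn.Module):
    """Hypothetical sketch: aggregation weights come from composing a fixed
    neighbor-based kernel (the adjacency mask) with a learnable
    feature-similarity kernel (dot products in a learned feature space)."""

    def __init__(self, in_dim: int, kernel_dim: int = 16):
        super().__init__()
        self.proj = nn.Linear(in_dim, kernel_dim)  # learnable feature space

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        z = self.proj(x)                                   # (N, kernel_dim)
        feature_kernel = z @ z.t()                         # similarity in the learned space
        # Compose with the neighbor-based kernel: keep similarities only on edges.
        scores = feature_kernel.masked_fill(adj == 0, float("-inf"))
        weights = torch.softmax(scores, dim=1)             # normalized aggregation weights
        return weights @ x                                 # kernel-weighted aggregation

# Toy usage on a 4-node chain with self-loops.
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]], dtype=torch.float32)
x = torch.randn(4, 8)
agg = CompositeKernelAggregation(in_dim=8)
print(agg(x, adj).shape)  # torch.Size([4, 8])
```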
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.