Residual Hyperbolic Graph Convolution Networks
- URL: http://arxiv.org/abs/2412.03825v1
- Date: Thu, 05 Dec 2024 02:38:45 GMT
- Title: Residual Hyperbolic Graph Convolution Networks
- Authors: Yangkai Xue, Jindou Dai, Zhipeng Lu, Yuwei Wu, Yunde Jia
- Abstract summary: Hyperbolic graph convolutional networks (HGCNs) have demonstrated the representational capability to model hierarchically structured graphs.
Over-smoothing may occur as the number of model layers increases, limiting the representation capabilities of most current HGCN models.
We propose residual hyperbolic graph convolutional networks (R-HGCNs) to address the over-smoothing problem.
- Score: 24.380072966869932
- License:
- Abstract: Hyperbolic graph convolutional networks (HGCNs) have demonstrated the representational capability to model hierarchically structured graphs. However, as in general GCNs, over-smoothing may occur as the number of model layers increases, limiting the representation capability of most current HGCN models. In this paper, we propose residual hyperbolic graph convolutional networks (R-HGCNs) to address the over-smoothing problem. We introduce a hyperbolic residual connection function to overcome the over-smoothing problem, and theoretically prove its effectiveness. Moreover, we use product manifolds and HyperDrop to facilitate the R-HGCNs. The distinctive features of the R-HGCNs are as follows: (1) The hyperbolic residual connection preserves the initial node information in each layer and adds a hyperbolic identity mapping to prevent node features from becoming indistinguishable. (2) Product manifolds in R-HGCNs are set up with different origin points in different components to facilitate the extraction of feature information from a wider range of perspectives, which enhances the representation capability of R-HGCNs. (3) HyperDrop adds multiplicative Gaussian noise to hyperbolic representations, so that perturbations can be introduced to alleviate over-fitting without deconstructing the hyperbolic geometry. Experimental results demonstrate the effectiveness of R-HGCNs under various numbers of graph convolution layers and different structures of product manifolds.
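To make the first and third ingredients concrete, below is a minimal sketch on the Poincaré ball of how a hyperbolic residual connection and a HyperDrop-style multiplicative perturbation could be implemented. The choice of the Poincaré ball model, the use of Möbius addition for the residual, the tangent-space formulation of the noise, and all function names are assumptions made for illustration; this is not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): a hyperbolic residual
# connection and HyperDrop-style multiplicative noise on the Poincare ball.
# The curvature c, the tangent-space noise, and all function names are
# assumptions made for this example.
import numpy as np

EPS = 1e-7

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball with curvature -c."""
    xy = np.sum(x * y, axis=-1, keepdims=True)
    x2 = np.sum(x * x, axis=-1, keepdims=True)
    y2 = np.sum(y * y, axis=-1, keepdims=True)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / np.maximum(den, EPS)

def expmap0(v, c=1.0):
    """Exponential map at the origin: tangent space -> Poincare ball."""
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), EPS)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def logmap0(x, c=1.0):
    """Logarithmic map at the origin: Poincare ball -> tangent space."""
    norm = np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), EPS)
    return np.arctanh(np.minimum(np.sqrt(c) * norm, 1 - EPS)) * x / (np.sqrt(c) * norm)

def hyperbolic_residual(h_layer, h_init, c=1.0):
    """One plausible hyperbolic residual: Mobius-add the initial embedding
    back onto the current layer's output so early node information survives."""
    return mobius_add(h_layer, h_init, c)

def hyperdrop(h, sigma=0.1, c=1.0, training=True):
    """HyperDrop-style perturbation: multiplicative Gaussian noise applied in
    the tangent space, then mapped back, so points stay on the manifold."""
    if not training:
        return h
    v = logmap0(h, c)
    noise = 1.0 + sigma * np.random.randn(*v.shape)
    return expmap0(noise * v, c)

# toy usage: 5 nodes, 4-dim embeddings near the origin of the ball
h0 = expmap0(0.1 * np.random.randn(5, 4))
h  = expmap0(0.1 * np.random.randn(5, 4))
out = hyperdrop(hyperbolic_residual(h, h0))
assert np.all(np.linalg.norm(out, axis=-1) < 1.0)  # still inside the unit ball
```

Applying the noise in the tangent space and mapping back keeps every perturbed point inside the ball, which matches the requirement that perturbations do not deconstruct the hyperbolic geometry.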
Related papers
- Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising in 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z)
- Continuous Geometry-Aware Graph Diffusion via Hyperbolic Neural PDE [36.48610732998552]
We introduce theoretical principles, e.g., field and flow, gradient, divergence, and diffusivity, on a non-Euclidean manifold for HPDE integration.
We propose the Hyperbolic Graph Diffusion Equation (HGDE) -- a flexible vector flow function that can be integrated to obtain expressive hyperbolic node embeddings.
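For context, here is a minimal sketch of the integrate-a-flow idea: a plain graph heat-diffusion equation integrated with explicit Euler steps in the tangent space at the origin of the Poincaré ball, then mapped back to the manifold. The simple heat-diffusion flow, the integrator, and the function names are assumptions for illustration, not HGDE's actual vector flow.

```python
# Illustrative sketch: Euler integration of graph heat diffusion dX/dt = -L X,
# carried out in the tangent space at the origin of the Poincare ball and
# mapped back afterwards.  HGDE's actual flow and integrator are more elaborate.
import numpy as np

def expmap0(v):
    n = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), 1e-7)
    return np.tanh(n) * v / n

def logmap0(x):
    n = np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), 1e-7)
    return np.arctanh(np.minimum(n, 1 - 1e-7)) * x / n

def diffuse(x_hyp, adj, steps=20, dt=0.05):
    """Integrate tangent-space features under a row-normalized graph Laplacian."""
    deg = adj.sum(axis=1)
    lap = np.diag(deg) - adj                        # combinatorial Laplacian
    lap = lap / np.maximum(deg, 1e-7)[:, None]      # random-walk normalization
    v = logmap0(x_hyp)
    for _ in range(steps):
        v = v - dt * lap @ v                        # explicit Euler step
    return expmap0(v)

# toy usage: a 4-node path graph
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = expmap0(0.1 * np.random.randn(4, 3))
print(diffuse(x, adj))
```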
arXiv Detail & Related papers (2024-06-03T12:50:58Z)
- DeepHGCN: Toward Deeper Hyperbolic Graph Convolutional Networks [21.605755985700615]
We propose DeepHGCN, the first deep multi-layer HGCN architecture with dramatically improved computational efficiency and substantially reduced over-smoothing.
DeepHGCN features two key innovations: (1) a novel hyperbolic feature transformation layer that enables fast and accurate linear mappings, and (2) techniques such as hyperbolic residual connections and regularization for both weights and features.
Extensive experiments demonstrate that DeepHGCN achieves significant improvements in link prediction and node classification tasks compared to both Euclidean and shallow hyperbolic GCN variants.
arXiv Detail & Related papers (2023-10-03T13:10:14Z)
- HGCH: A Hyperbolic Graph Convolution Network Model for Heterogeneous Collaborative Graph Recommendation [11.651443951846668]
We propose an enhanced HGCN-based model for collaborative filtering that integrates diverse side information into a heterogeneous collaborative graph.
We evaluate HGCH on four real datasets, and the results show that HGCH achieves competitive results and outperforms leading baselines.
arXiv Detail & Related papers (2023-04-06T09:38:54Z)
- Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z)
- Clenshaw Graph Neural Networks [14.8308791628821]
Graph Convolutional Networks (GCNs) are foundational methods for learning graph representations.
Existing residual connection techniques fail to make extensive use of underlying graph structure.
We introduce ClenshawGCN, a GNN model that employs the Clenshaw Summation algorithm to enhance the expressiveness of the GCN model.
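For context, Clenshaw summation evaluates a truncated Chebyshev series with a backward recurrence instead of forming each polynomial explicitly. The sketch below applies that recurrence to a polynomial graph filter with placeholder coefficients; it illustrates the recurrence only and is not the ClenshawGCN layer.

```python
# Illustrative sketch of Clenshaw summation for a Chebyshev polynomial graph
# filter sum_k c_k T_k(L_hat) X.  The coefficients and the scaled Laplacian
# L_hat are placeholders, not the learned ClenshawGCN parameters.
import numpy as np

def clenshaw_filter(l_hat, x, coeffs):
    """Evaluate sum_k coeffs[k] * T_k(l_hat) @ x via the Clenshaw recurrence.

    l_hat  : (n, n) symmetric propagation matrix with spectrum in [-1, 1]
    x      : (n, d) node features
    coeffs : filter coefficients c_0 .. c_K
    """
    b_next = np.zeros_like(x)   # b_{k+1}
    b_after = np.zeros_like(x)  # b_{k+2}
    for c in reversed(coeffs[1:]):                     # k = K, ..., 1
        b_cur = c * x + 2 * (l_hat @ b_next) - b_after
        b_after, b_next = b_next, b_cur
    return coeffs[0] * x + l_hat @ b_next - b_after    # p(L_hat) X

# toy usage: 4-node path graph, rescaled normalized Laplacian
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
deg = adj.sum(1)
lap = np.eye(4) - adj / np.sqrt(np.outer(deg, deg))    # normalized Laplacian
l_hat = lap - np.eye(4)                                # shift spectrum into [-1, 1]
x = np.random.randn(4, 3)
y = clenshaw_filter(l_hat, x, [0.5, 0.3, 0.2])
```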
arXiv Detail & Related papers (2022-10-29T06:32:39Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Lorentzian Graph Convolutional Networks [47.41609636856708]
We propose a novel hyperbolic graph convolutional network named the Lorentzian graph convolutional network (LGCN).
LGCN rigorously guarantees the learned node features follow the hyperbolic geometry.
Experiments on six datasets show that LGCN performs better than the state-of-the-art methods.
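One standard way to guarantee that node features stay in hyperbolic geometry is to keep them on the hyperboloid of the Lorentz model, which the Lorentzian formulation suggests. The sketch below shows that constraint with curvature fixed to -1; the helper names are assumptions for illustration, and this is not LGCN's actual layer.

```python
# Illustrative sketch of the Lorentz (hyperboloid) model used to keep
# embeddings on the hyperbolic manifold.  Curvature is fixed to -1 and the
# helper names are assumptions for this example.
import numpy as np

def lorentz_inner(u, v):
    """Minkowski inner product <u, v>_L = -u_0 v_0 + sum_i u_i v_i."""
    return -u[..., 0] * v[..., 0] + np.sum(u[..., 1:] * v[..., 1:], axis=-1)

def expmap_origin(v_spatial):
    """Map a tangent vector (given by its spatial part at the hyperboloid
    origin o = (1, 0, ..., 0)) onto the manifold."""
    n = np.maximum(np.linalg.norm(v_spatial, axis=-1, keepdims=True), 1e-12)
    time = np.cosh(n)
    space = np.sinh(n) * v_spatial / n
    return np.concatenate([time, space], axis=-1)

# toy usage: embeddings satisfy the hyperboloid constraint <x, x>_L = -1
x = expmap_origin(np.random.randn(5, 3))
assert np.allclose(lorentz_inner(x, x), -1.0)
```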
arXiv Detail & Related papers (2021-04-15T14:14:25Z)
- A Hyperbolic-to-Hyperbolic Graph Convolutional Network [46.80564170208473]
We propose a hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that works directly on the hyperbolic manifold.
The H2H-GCN achieves substantial improvements on the link prediction, node classification, and graph classification tasks.
arXiv Detail & Related papers (2021-04-14T16:09:27Z)
- Cross-GCN: Enhancing Graph Convolutional Network with $k$-Order Feature Interactions [153.6357310444093]
Graph Convolutional Network (GCN) is an emerging technique that performs learning and reasoning on graph data.
We argue that existing designs of GCN forgo modeling cross features, making GCN less effective for tasks or data where cross features are important.
We design a new operator named Cross-feature Graph Convolution, which explicitly models arbitrary-order cross features with complexity linear in the feature dimension and the order size.
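A common trick for obtaining order-k cross features at a cost linear in the feature dimension and in k is to build them recursively with element-wise products, as in cross networks. The sketch below combines that recursion with one mean-aggregation step over neighbours; it illustrates the linear-complexity argument only and does not reproduce the Cross-feature Graph Convolution operator.

```python
# Illustrative sketch: order-k cross features built recursively with
# element-wise products, combined with a single mean-aggregation step.
# Cost is O(k * n * d) rather than exponential in k.  This is an assumed
# construction for illustration, not the Cross-GCN operator itself.
import numpy as np

def cross_feature_conv(adj, x, order=3):
    """Sum over orders 1..order of element-wise crosses between each node's
    features and its aggregated neighbourhood features."""
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    agg = adj @ x / deg          # mean over neighbours
    cross = agg                  # order-1 term
    out = cross.copy()
    for _ in range(order - 1):   # raise the cross order by one each step
        cross = cross * x        # element-wise product keeps cost linear in d
        out += cross
    return out

# toy usage
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)
x = np.random.randn(3, 4)
print(cross_feature_conv(adj, x, order=3))
```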
arXiv Detail & Related papers (2020-03-05T13:05:27Z)