HD-GCN: A Hybrid Diffusion Graph Convolutional Network
- URL: http://arxiv.org/abs/2303.17966v1
- Date: Fri, 31 Mar 2023 11:12:25 GMT
- Title: HD-GCN: A Hybrid Diffusion Graph Convolutional Network
- Authors: Zhi Yang, Kang Li, Haitao Gan, Zhongwei Huang, Ming Shi
- Abstract summary: We introduce a new framework for graph convolutional networks called Hybrid Diffusion-based Graph Convolutional Network (HD-GCN).
HD-GCN utilizes hybrid diffusion by combining information diffusion between neighborhood nodes in the feature space and adjacent nodes in the adjacency matrix.
We evaluate the performance of HD-GCN on three well-known citation network datasets.
- Score: 13.906335023159002
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The information diffusion performance of GCN and its variant models is
limited by the adjacency matrix, which can lower their performance. Therefore,
we introduce a new framework for graph convolutional networks called Hybrid
Diffusion-based Graph Convolutional Network (HD-GCN) to address the limitations
of information diffusion caused by the adjacency matrix. In the HD-GCN
framework, we initially utilize diffusion maps to facilitate the diffusion of
information among nodes that are adjacent to each other in the feature space.
This allows for the diffusion of information between similar points that may
not have an adjacent relationship. Next, we utilize graph convolution to
further propagate information among adjacent nodes after the diffusion maps,
thereby enabling the spread of information among similar nodes that are
adjacent in the graph. Finally, we employ the diffusion distances obtained
through the use of diffusion maps to regularize and constrain the predicted
labels of training nodes. This regularization method is then applied to the
HD-GCN training, resulting in a smoother classification surface. The model
proposed in this paper effectively overcomes the limitations of information
diffusion imposed only by the adjacency matrix. HD-GCN utilizes hybrid
diffusion by combining information diffusion between neighborhood nodes in the
feature space and adjacent nodes in the adjacency matrix. This method allows
for more comprehensive information propagation among nodes, resulting in
improved model performance. We evaluated the performance of HD-GCN on three
well-known citation network datasets and the results showed that the proposed
framework is more effective than several graph-based semi-supervised learning
methods.
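The three steps described in the abstract (diffusion in feature space via diffusion maps, graph convolution over the adjacency, and diffusion-distance regularization of predictions) can be sketched roughly as follows. This is a minimal numpy illustration of the hybrid-diffusion idea; the Gaussian kernel bandwidth, the number of diffusion steps, and the exact regularizer weighting are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def feature_diffusion(X, sigma=1.0, t=1):
    # Step 1: diffusion-map style step in FEATURE space. Build a Gaussian
    # affinity from pairwise feature distances, normalize it into a
    # row-stochastic Markov matrix, and diffuse features among similar
    # nodes even when they are not adjacent in the graph.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    P = K / K.sum(1, keepdims=True)              # row-stochastic transitions
    return np.linalg.matrix_power(P, t) @ X, P

def gcn_propagate(A, X, W):
    # Step 2: one GCN layer over the ADJACENCY: symmetric normalization
    # with self-loops, then linear transform and ReLU.
    A_hat = A + np.eye(A.shape[0])
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

def diffusion_distance_penalty(P, logits, t=1):
    # Step 3: regularizer encouraging nodes that are close in diffusion
    # distance to receive similar predictions (smoother decision surface).
    Pt = np.linalg.matrix_power(P, t)
    d2 = ((Pt[:, None, :] - Pt[None, :, :]) ** 2).sum(-1)  # sq. diffusion dist.
    w = np.exp(-d2)                                        # closer -> heavier
    diff = ((logits[:, None, :] - logits[None, :, :]) ** 2).sum(-1)
    return (w * diff).sum() / w.size

# Toy run: 4 nodes, 3 features, a path graph.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
W = rng.normal(size=(3, 2))
X_diff, P = feature_diffusion(X)
H = gcn_propagate(A, X_diff, W)
reg = diffusion_distance_penalty(P, H)
print(H.shape)  # (4, 2)
```

In a full training loop, the penalty would be added to the supervised classification loss on the labeled nodes.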
Related papers
- ReDiSC: A Reparameterized Masked Diffusion Model for Scalable Node Classification with Structured Predictions [64.17845687013434]
We propose ReDiSC, a structured diffusion model for structured node classification. We show that ReDiSC achieves superior or highly competitive performance compared to state-of-the-art GNN, label propagation, and diffusion-based baselines. Notably, ReDiSC scales effectively to large-scale datasets on which previous structured diffusion methods fail due to computational constraints.
arXiv Detail & Related papers (2025-07-19T04:46:53Z) - Cooperative Sheaf Neural Networks [3.862247454265944]
We show that existing sheaf diffusion methods fail to achieve cooperative behavior due to the lack of message directionality. We propose Cooperative Sheaf Neural Networks (CSNNs) to overcome this limitation. Our experiments show that CSNN presents overall better performance compared to prior art on sheaf diffusion as well as cooperative graph neural networks.
arXiv Detail & Related papers (2025-07-01T10:42:41Z) - Exploring Molecule Generation Using Latent Space Graph Diffusion [0.0]
Generating molecular graphs is a challenging task due to their discrete nature and the competitive objectives involved.
For molecular graphs, graph neural networks (GNNs) as a diffusion backbone have achieved impressive results.
Latent space diffusion, where diffusion occurs in a low-dimensional space via an autoencoder, has demonstrated computational efficiency.
arXiv Detail & Related papers (2025-01-07T10:54:44Z) - Flexible Diffusion Scopes with Parameterized Laplacian for Heterophilic Graph Learning [7.775158500331812]
We propose a new class of parameterized Laplacian matrices, which provably offers more flexibility in controlling the diffusion distance between nodes.
We show that the parameters in the Laplacian enable flexibility of the diffusion scopes.
We propose two GNNs with flexible diffusion scopes: namely the Parameterized Diffusion based Graph Convolutional Networks (PD-GCN) and Graph Attention Networks (PD-GAT).
arXiv Detail & Related papers (2024-09-15T22:52:46Z) - Graph Neural Aggregation-diffusion with Metastability [4.040326569845733]
Continuous graph neural models based on differential equations have expanded the architecture of graph neural networks (GNNs).
We propose GRADE inspired by graph aggregation-diffusion equations, which includes the delicate balance between nonlinear diffusion and aggregation induced by interaction potentials.
We prove that GRADE achieves competitive performance across various benchmarks and alleviates the over-smoothing issue in GNNs.
arXiv Detail & Related papers (2024-03-29T15:05:57Z) - Learning to Approximate Adaptive Kernel Convolution on Graphs [4.434835769977399]
We propose a diffusion learning framework, where the range of feature aggregation is controlled by the scale of a diffusion kernel.
Our model is tested on various standard datasets for node-wise classification, achieving state-of-the-art performance.
It is also validated on a real-world brain network data for graph classifications to demonstrate its practicality for Alzheimer classification.
arXiv Detail & Related papers (2024-01-22T10:57:11Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - Accelerating Scalable Graph Neural Network Inference with Node-Adaptive
Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z) - DiGress: Discrete Denoising diffusion for graph generation [79.13904438217592]
DiGress is a discrete denoising diffusion model for generating graphs with categorical node and edge attributes.
It achieves state-of-the-art performance on molecular and non-molecular datasets, with up to 3x validity improvement.
It is also the first model to scale to the large GuacaMol dataset containing 1.3M drug-like molecules.
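The categorical (rather than Gaussian) forward-noising idea behind discrete denoising diffusion can be illustrated with a uniform-transition chain over node types. The schedule and sampling below are a toy sketch, not DiGress's exact transition matrices.

```python
import numpy as np

def uniform_transition(K, beta):
    # Q = (1 - beta) * I + beta * uniform: a node keeps its class with
    # probability (1 - beta), otherwise resamples uniformly over K classes.
    return (1 - beta) * np.eye(K) + beta * np.ones((K, K)) / K

def noise_labels(x_onehot, betas, rng):
    # Apply the discrete forward chain step by step: x_t ~ Cat(x_{t-1} Q_t).
    x = x_onehot
    K = x.shape[1]
    for beta in betas:
        probs = x @ uniform_transition(K, beta)   # per-node class distribution
        x = np.eye(K)[[rng.choice(K, p=p) for p in probs]]
    return x

rng = np.random.default_rng(0)
x0 = np.eye(3)[[0, 1, 2, 0]]                      # 4 nodes, 3 node types
xt = noise_labels(x0, betas=[0.1, 0.2, 0.4], rng=rng)
print(xt.shape)  # (4, 3), still one-hot
```

The denoiser is then trained to invert this chain; in DiGress the same idea is applied jointly to categorical node and edge attributes.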
arXiv Detail & Related papers (2022-09-29T12:55:03Z) - Interpolation-based Correlation Reduction Network for Semi-Supervised
Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z) - Diffusion-GAN: Training GANs with Diffusion [135.24433011977874]
Generative adversarial networks (GANs) are challenging to train stably.
We propose Diffusion-GAN, a novel GAN framework that leverages a forward diffusion chain to generate instance noise.
We show that Diffusion-GAN can produce more realistic images with higher stability and data efficiency than state-of-the-art GANs.
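The forward-diffusion instance noise can be sketched as follows: both real and generated samples are perturbed by the same Gaussian forward chain before reaching the discriminator. The linear beta schedule and the fixed timestep here are illustrative assumptions (Diffusion-GAN adjusts the noise level adaptively during training).

```python
import numpy as np

def diffuse(x, t, alpha_bars, rng):
    # Forward diffusion q(x_t | x_0): blend the sample with Gaussian noise
    # according to the cumulative schedule alpha_bar_t.
    a = alpha_bars[t]
    eps = rng.normal(size=x.shape)
    return np.sqrt(a) * x + np.sqrt(1 - a) * eps

# Toy schedule; real and fake batches share the same timestep, so the
# discriminator always compares samples at a matched noise level.
betas = np.linspace(1e-4, 0.02, 100)
alpha_bars = np.cumprod(1 - betas)
rng = np.random.default_rng(0)
real = rng.normal(size=(8, 16))                   # stand-in for real data
fake = rng.normal(size=(8, 16))                   # stand-in for generator output
t = 50
real_t = diffuse(real, t, alpha_bars, rng)
fake_t = diffuse(fake, t, alpha_bars, rng)
print(real_t.shape)  # (8, 16)
```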
arXiv Detail & Related papers (2022-06-05T20:45:01Z) - Graph Neural Diffusion Networks for Semi-supervised Learning [6.376489604292251]
The Graph Convolutional Network (GCN) is a pioneering model for graph-based semi-supervised learning.
We propose a new graph neural network called GND-Nets (for Graph Neural Diffusion Networks) that exploits both local and global neighborhood information.
The adoption of neural networks makes neural diffusions adaptable to different datasets.
arXiv Detail & Related papers (2022-01-24T14:07:56Z) - AMA-GCN: Adaptive Multi-layer Aggregation Graph Convolutional Network
for Disease Prediction [20.19380805655623]
We propose an encoder that automatically selects the appropriate phenotypic measures according to their spatial distribution.
We also propose a novel graph convolution network architecture using multi-layer aggregation mechanism.
arXiv Detail & Related papers (2021-06-16T12:13:23Z) - Uniting Heterogeneity, Inductiveness, and Efficiency for Graph
Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
A majority class of GNNs are only designed for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z) - Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCN) are widely used in learning graph representations due to their effectiveness and efficiency.
They suffer from the notorious over-smoothing problem, in which the learned node representations converge to similar vectors as many layers are stacked.
We propose Graph Highway Networks (GHNet) which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
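A gated layer in the spirit of the GHNet blurb above might look as follows: a learned sigmoid gate interpolates between the neighbor-aggregated (homogeneous) signal and the node's own (heterogeneous) representation. The gate parameterization is a hedged sketch, not GHNet's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_gcn_layer(A_hat, H, W, Wg, bg):
    # Aggregate neighbors as in a plain GCN layer...
    H_agg = np.maximum(A_hat @ H @ W, 0.0)
    # ...then gate per node and per dimension between the aggregated
    # signal and the untouched input representation.
    g = sigmoid(H @ Wg + bg)                      # gate values in (0, 1)
    return g * H_agg + (1 - g) * H

rng = np.random.default_rng(0)
n, d = 5, 4
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                            # symmetrize
A_hat = A + np.eye(n)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))
A_hat = D_inv_sqrt @ A_hat @ D_inv_sqrt           # normalized adjacency
H = rng.normal(size=(n, d))
H1 = highway_gcn_layer(A_hat, H, rng.normal(size=(d, d)),
                       rng.normal(size=(d, d)), 0.0)
print(H1.shape)  # (5, 4)
```

Because the gate can fall back to the identity path, stacking many such layers is less prone to collapsing all node representations into similar vectors.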
This list is automatically generated from the titles and abstracts of the papers in this site.