Flexible Diffusion Scopes with Parameterized Laplacian for Heterophilic Graph Learning
- URL: http://arxiv.org/abs/2409.09888v1
- Date: Sun, 15 Sep 2024 22:52:46 GMT
- Title: Flexible Diffusion Scopes with Parameterized Laplacian for Heterophilic Graph Learning
- Authors: Qincheng Lu, Jiaqi Zhu, Sitao Luan, Xiao-Wen Chang
- Abstract summary: We propose a new class of parameterized Laplacian matrices, which provably offers more flexibility in controlling the diffusion distance between nodes.
We show that the parameters in the Laplacian enable flexible diffusion scopes.
We propose two GNNs with flexible diffusion scopes: namely the Parameterized Diffusion based Graph Convolutional Networks (PD-GCN) and Graph Attention Networks (PD-GAT).
- Score: 7.775158500331812
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The ability of Graph Neural Networks (GNNs) to capture long-range and global topology information is limited by the scope of the conventional graph Laplacian, leading to unsatisfactory performance on some datasets, particularly on heterophilic graphs. To address this limitation, we propose a new class of parameterized Laplacian matrices, which provably offers more flexibility in controlling the diffusion distance between nodes than the conventional graph Laplacian, allowing long-range information to be adaptively captured through diffusion on the graph. Specifically, we first prove that the diffusion distance and spectral distance on a graph have an order-preserving relationship. With this result, we demonstrate that the parameterized Laplacian can accelerate the diffusion of long-range information, and that the parameters in the Laplacian enable flexible diffusion scopes. Based on these theoretical results, we propose a topology-guided rewiring mechanism to capture helpful long-range neighborhood information for heterophilic graphs. With this mechanism and the new Laplacian, we propose two GNNs with flexible diffusion scopes: namely the Parameterized Diffusion based Graph Convolutional Networks (PD-GCN) and Graph Attention Networks (PD-GAT). Synthetic experiments reveal high correlations between the parameters of the new Laplacian and the performance of the parameterized GNNs under various graph homophily levels, which verifies that our newly proposed GNNs indeed have the ability to adjust the parameters to adaptively capture global information for different levels of heterophily. They also outperform the state-of-the-art (SOTA) models on 6 out of 7 real-world benchmark datasets, which further confirms their superiority.
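For intuition, here is a minimal NumPy sketch of feature diffusion under a parameterized Laplacian. The specific two-parameter family below (a self-loop weight `alpha` and a normalization exponent `gamma`) is an illustrative assumption, not necessarily the exact family defined in the paper:

```python
import numpy as np

def parameterized_laplacian(A, alpha=1.0, gamma=0.5):
    """Hypothetical parameterized Laplacian L(alpha, gamma).

    alpha adds weighted self-loops, slowing or speeding diffusion;
    gamma interpolates the normalization: gamma=0.5 gives the symmetric
    D^{-1/2} A D^{-1/2} form, gamma=1.0 the random-walk D^{-1} A form.
    """
    n = A.shape[0]
    A_a = A + alpha * np.eye(n)
    d = A_a.sum(axis=1)
    return np.eye(n) - np.diag(d ** -gamma) @ A_a @ np.diag(d ** (gamma - 1.0))

def diffuse(X, L, t=2.0, steps=40):
    """Approximate heat diffusion exp(-t L) @ X with explicit Euler steps."""
    H = X.copy()
    for _ in range(steps):
        H = H - (t / steps) * (L @ H)
    return H

# Toy example: 4-node path graph; varying (alpha, gamma) changes how far
# information spreads, i.e., the effective diffusion scope.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
print(diffuse(np.eye(4), parameterized_laplacian(A, alpha=0.5, gamma=0.5)))
```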
Related papers
- ReDiSC: A Reparameterized Masked Diffusion Model for Scalable Node Classification with Structured Predictions [64.17845687013434]
We propose ReDiSC, a structured diffusion model for structured node classification. We show that ReDiSC achieves superior or highly competitive performance compared to state-of-the-art GNN, label propagation, and diffusion-based baselines. Notably, ReDiSC scales effectively to large-scale datasets on which previous structured diffusion methods fail due to computational constraints.
arXiv Detail & Related papers (2025-07-19T04:46:53Z)
- GRAIN: Multi-Granular and Implicit Information Aggregation Graph Neural Network for Heterophilous Graphs [11.458759345322832]
Granular and Implicit Graph Network (GRAIN) is a novel GNN model specifically designed for heterophilous graphs.
GRAIN enhances node embeddings by aggregating multi-view information at various levels and incorporating implicit data from distant, non-neighboring nodes.
We also introduce an adaptive graph information aggregator that efficiently combines multi-granularity and implicit data, significantly improving node representation quality.
arXiv Detail & Related papers (2025-04-09T07:36:44Z)
- Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report [21.288230563135055]
We propose a novel graph spectral sparsification method to approximate the propagation patterns of spectral Graph Neural Networks (GNNs).
Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw features.
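For background, a sketch of the classic effective-resistance (Spielman-Srivastava) spectral sparsifier is below; this is the textbook construction, not necessarily the specific sparsification method used in that paper:

```python
import numpy as np

def spectral_sparsify(A, n_samples, seed=0):
    """Textbook spectral sparsification: sample edges with probability
    proportional to weight * effective resistance, reweighting samples so
    the sparsifier's Laplacian approximates the original in expectation.
    (The pseudo-inverse is fine for small graphs; large graphs need
    approximate resistances via random projections.)"""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    L_pinv = np.linalg.pinv(np.diag(A.sum(axis=1)) - A)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n) if A[u, v] > 0]
    w = np.array([A[u, v] for u, v in edges])
    # Effective resistance R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v)
    R = np.array([L_pinv[u, u] + L_pinv[v, v] - 2 * L_pinv[u, v]
                  for u, v in edges])
    p = w * R / (w * R).sum()
    H = np.zeros_like(A, dtype=float)
    for i in rng.choice(len(edges), size=n_samples, p=p):
        u, v = edges[i]
        H[u, v] += w[i] / (n_samples * p[i])
        H[v, u] = H[u, v]
    return H
```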
arXiv Detail & Related papers (2025-01-08T15:36:19Z)
- Dual-Frequency Filtering Self-aware Graph Neural Networks for Homophilic and Heterophilic Graphs [60.82508765185161]
We propose Dual-Frequency Filtering Self-aware Graph Neural Networks (DFGNN).
DFGNN integrates low-pass and high-pass filters to extract smooth and detailed topological features.
It dynamically adjusts filtering ratios to accommodate both homophilic and heterophilic graphs.
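The general idea of mixing frequency bands can be sketched as follows; the filters and the scalar mixing ratio here are an illustrative simplification, not DFGNN's actual architecture:

```python
import numpy as np

def dual_frequency_filter(A, X, beta=0.5):
    """Combine a low-pass and a high-pass graph filter.

    With the symmetric-normalized adjacency A_hat (no isolated nodes
    assumed), (X + A_hat @ X)/2 smooths features across edges (suits
    homophily) while (X - A_hat @ X)/2 sharpens differences between
    neighbors (suits heterophily); beta, learnable in a real model,
    sets the mixing ratio.
    """
    d = A.sum(axis=1)
    A_hat = A / np.sqrt(np.outer(d, d))   # D^{-1/2} A D^{-1/2}
    smooth = A_hat @ X
    return beta * 0.5 * (X + smooth) + (1.0 - beta) * 0.5 * (X - smooth)
```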
arXiv Detail & Related papers (2024-11-18T04:57:05Z)
- Representation Learning on Heterophilic Graph with Directional Neighborhood Attention [8.493802098034255]
Graph Attention Network (GAT) is one of the most popular Graph Neural Network (GNN) architectures.
GAT lacks the ability to capture long-range and global graph information, leading to unsatisfactory performance on some datasets.
We propose Directional Graph Attention Network (DGAT) to combine the feature-based attention with the global directional information extracted from the graph topology.
arXiv Detail & Related papers (2024-03-03T10:59:16Z)
- Accelerating Scalable Graph Neural Network Inference with Node-Adaptive Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Implicit Graph Neural Diffusion Networks: Convergence, Generalization, and Over-Smoothing [7.984586585987328]
Implicit Graph Neural Networks (GNNs) have achieved significant success in addressing graph learning problems.
We introduce a geometric framework for designing implicit graph diffusion layers based on a parameterized graph Laplacian operator.
We show how implicit GNN layers can be viewed as the fixed-point equation of a Dirichlet energy minimization problem.
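To make the fixed-point view concrete: minimizing a Dirichlet-type energy ||H - X||_F^2 + lam * tr(H^T L H) sets the gradient to zero and yields (I + lam*L) H = X, i.e., the fixed point H = X - lam * L @ H. A sketch of the resulting implicit layer (an illustrative construction, not the paper's exact operator):

```python
import numpy as np

def implicit_diffusion_layer(L, X, lam=0.4, tol=1e-8, max_iter=1000):
    """Solve the fixed point H = X - lam * L @ H, the stationarity
    condition of the energy ||H - X||_F^2 + lam * tr(H^T L H).

    Plain fixed-point iteration converges when lam * ||L||_2 < 1;
    otherwise solve (I + lam*L) H = X directly with np.linalg.solve.
    """
    H = X.copy()
    for _ in range(max_iter):
        H_new = X - lam * (L @ H)
        if np.linalg.norm(H_new - H) < tol:
            break
        H = H_new
    return H
```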
arXiv Detail & Related papers (2023-08-07T05:22:33Z)
- Re-Think and Re-Design Graph Neural Networks in Spaces of Continuous Graph Diffusion Functionals [7.6435511285856865]
Graph neural networks (GNNs) are widely used in domains like social networks and biological systems.
The locality assumption of GNNs hampers their ability to capture long-range dependencies and global patterns in graphs.
We propose a new inductive bias based on variational analysis, drawing inspiration from the Brachistochrone problem.
arXiv Detail & Related papers (2023-07-01T04:44:43Z)
- Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
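A sketch of the one-scalar-per-relation idea follows; this is an illustrative reading of the summary, with hypothetical names, rather than the exact RE-GNN layer:

```python
import numpy as np

def relation_embedding_layer(adj_per_relation, X, W, rel_scalars, self_scalar):
    """Aggregate a heterogeneous graph with one learned scalar per
    relation type (plus one for self-loops) and a single shared linear
    transform W, instead of per-relation weight matrices.
    """
    H = self_scalar * X
    for A_r, s_r in zip(adj_per_relation, rel_scalars):
        deg = np.maximum(A_r.sum(axis=1, keepdims=True), 1.0)  # avoid /0
        H = H + s_r * ((A_r / deg) @ X)   # row-normalized propagation
    return H @ W
```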
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph
Neural Networks [25.59092732148598]
Heterogeneous graph neural networks (GNNs) achieve strong performance on node classification tasks in a semi-supervised learning setting.
We propose a novel heterogeneous GNN architecture in which layers are derived from optimization steps that descend a novel relation-aware energy function.
arXiv Detail & Related papers (2022-06-22T13:48:08Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
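For context, the diffusion wavelets that geometric scattering builds on can be written as differences of dyadic powers of a lazy random-walk operator; a standard sketch is below (the learnable relaxation in LEGS is not reproduced here):

```python
import numpy as np

def diffusion_wavelet_features(A, X, n_scales=3):
    """Band-pass diffusion wavelets Psi_j = P^(2^(j-1)) - P^(2^j), with
    P = (I + A D^{-1}) / 2 the lazy random-walk operator, applied to X.
    This is the standard geometric-scattering construction."""
    n = A.shape[0]
    deg = np.maximum(A.sum(axis=0), 1.0)
    P = 0.5 * (np.eye(n) + A / deg)         # column-normalized lazy walk
    feats, P_low = [], P                    # P_low = P^(2^(j-1)) for j = 1
    for _ in range(n_scales):
        P_high = P_low @ P_low              # square to reach P^(2^j)
        feats.append((P_low - P_high) @ X)  # band-pass response at scale j
        P_low = P_high
    return feats
```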
This list is automatically generated from the titles and abstracts of the papers on this site.