Dual-Frequency Filtering Self-aware Graph Neural Networks for Homophilic and Heterophilic Graphs
- URL: http://arxiv.org/abs/2411.11284v1
- Date: Mon, 18 Nov 2024 04:57:05 GMT
- Title: Dual-Frequency Filtering Self-aware Graph Neural Networks for Homophilic and Heterophilic Graphs
- Authors: Yachao Yang, Yanfeng Sun, Jipeng Guo, Junbin Gao, Shaofan Wang, Fujiao Ju, Baocai Yin
- Abstract summary: We propose Dual-Frequency Filtering Self-aware Graph Neural Networks (DFGNN).
DFGNN integrates low-pass and high-pass filters to extract smooth and detailed topological features.
It dynamically adjusts filtering ratios to accommodate both homophilic and heterophilic graphs.
- Score: 60.82508765185161
- License:
- Abstract: Graph Neural Networks (GNNs) have excelled in handling graph-structured data, attracting significant research interest. However, two primary challenges have emerged: interference between topology and attributes distorting node representations, and the low-pass filtering nature of most GNNs leading to the oversight of valuable high-frequency information in graph signals. These issues are particularly pronounced in heterophilic graphs. To address these challenges, we propose Dual-Frequency Filtering Self-aware Graph Neural Networks (DFGNN). DFGNN integrates low-pass and high-pass filters to extract smooth and detailed topological features, using frequency-specific constraints to minimize noise and redundancy in the respective frequency bands. The model dynamically adjusts filtering ratios to accommodate both homophilic and heterophilic graphs. Furthermore, DFGNN mitigates interference by aligning topological and attribute representations through dynamic correspondences between their respective frequency bands, enhancing overall model performance and expressiveness. Extensive experiments conducted on benchmark datasets demonstrate that DFGNN outperforms state-of-the-art methods in classification performance, highlighting its effectiveness in handling both homophilic and heterophilic graphs.
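The abstract describes the mechanism only at a high level, and the paper's own implementation is not shown here. As a minimal NumPy sketch of the dual-frequency idea, the code below applies a low-pass and a high-pass graph filter to node features and blends them with a filtering ratio. The function names and the fixed ratio `alpha` are illustrative assumptions; in DFGNN the ratio is adjusted dynamically and frequency-specific constraints are applied to each band, neither of which is reproduced.

```python
import numpy as np

def sym_norm_adj(A):
    """Symmetrically normalized adjacency D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def dual_frequency_filter(A, X, alpha=0.5):
    """Blend low-pass and high-pass filtered node features.

    Low-pass:  (I + A_hat) / 2  -- smooths features over neighbors.
    High-pass: (I - A_hat) / 2  -- emphasizes differences from neighbors.
    `alpha` stands in for the filtering ratio; DFGNN adjusts it dynamically,
    here it is a fixed constant for illustration only.
    """
    n = A.shape[0]
    A_hat = sym_norm_adj(A)
    I = np.eye(n)
    low = 0.5 * (I + A_hat) @ X    # smooth (low-frequency) component
    high = 0.5 * (I - A_hat) @ X   # detail (high-frequency) component
    return alpha * low + (1.0 - alpha) * high

# Toy example: a 4-node path graph with 2-dimensional features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
Z = dual_frequency_filter(A, X, alpha=0.7)   # more weight on the smooth part
print(Z.shape)  # (4, 2)
```

Intuitively, homophilic graphs favor a larger `alpha` (more smoothing), while heterophilic graphs benefit from more weight on the high-frequency component, which is the motivation behind the dynamically adjusted ratio.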
Related papers
- Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach [58.8524608686851]
Graph Neural Networks (GNNs) have proven to be highly effective for node classification tasks across diverse graph structural patterns.
Traditionally, GNNs employ a uniform global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs.
We introduce a novel GNN framework, Node-MoE, that uses a mixture of experts to adaptively select the appropriate filter for each node (a minimal sketch of this idea follows this entry).
arXiv Detail & Related papers (2024-06-05T17:12:38Z)
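As a rough illustration of the node-wise filter selection described in the Node-MoE entry above (not the authors' code), the sketch below mixes a low-pass and a high-pass filter expert per node using softmax gate weights; the gating scores a trained gating network would produce are replaced by random values, and all names are invented for the example.

```python
import numpy as np

def sym_norm_adj(A):
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def node_wise_filter_moe(A, X, gate_logits):
    """Per-node mixture of a low-pass and a high-pass filter expert.

    gate_logits: (n, 2) scores that a gating network would produce;
    a softmax turns them into per-node mixture weights.
    """
    n = A.shape[0]
    A_hat = sym_norm_adj(A)
    I = np.eye(n)
    experts = np.stack([(I + A_hat) @ X,      # expert 0: low-pass
                        (I - A_hat) @ X])     # expert 1: high-pass
    w = np.exp(gate_logits - gate_logits.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)      # (n, 2) softmax weights
    # Each node mixes the two experts with its own weights.
    return w[:, 0, None] * experts[0] + w[:, 1, None] * experts[1]

rng = np.random.default_rng(0)
A = (rng.random((6, 6)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                 # random undirected toy graph
X = rng.normal(size=(6, 3))
Z = node_wise_filter_moe(A, X, gate_logits=rng.normal(size=(6, 2)))
print(Z.shape)  # (6, 3)
```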
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world application of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
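The SGGC entry above only states that ego-graphs are selected greedily based on spectral embeddings; since the exact objective is not given here, the following is just a generic farthest-point greedy selection over normalized-Laplacian eigenvector embeddings, with hypothetical helper names (`spectral_embedding`, `greedy_coreset`).

```python
import numpy as np

def spectral_embedding(A, k=2):
    """Embed nodes with the first k nontrivial eigenvectors of the normalized Laplacian."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    L = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:k + 1]             # skip the trivial eigenvector

def greedy_coreset(emb, m):
    """Greedy farthest-point selection of m nodes in the embedding space."""
    chosen = [0]                        # start from an arbitrary node
    dist = np.linalg.norm(emb - emb[0], axis=1)
    while len(chosen) < m:
        nxt = int(dist.argmax())        # node farthest from the current set
        chosen.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(emb - emb[nxt], axis=1))
    return chosen

rng = np.random.default_rng(1)
A = (rng.random((10, 10)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T
coreset_nodes = greedy_coreset(spectral_embedding(A, k=3), m=4)
print(coreset_nodes)                    # indices of the selected ego-graph centers
```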
- Universally Robust Graph Neural Networks by Preserving Neighbor Similarity [5.660584039688214]
We introduce a novel robust model, NSPGNN, which incorporates a dual-kNN graph pipeline to supervise neighbor-similarity-guided propagation.
Experiments on both homophilic and heterophilic graphs validate the universal robustness of NSPGNN compared to the state-of-the-art methods.
arXiv Detail & Related papers (2024-01-18T06:57:29Z)
- Shape-aware Graph Spectral Learning [36.63516222161871]
Spectral Graph Neural Networks (GNNs) are gaining attention for their ability to surpass the limitations of message-passing GNNs.
Some works empirically show that the preferred graph frequency is related to the graph homophily level.
However, this relationship between graph frequency and homophily/heterophily has not been systematically analyzed or exploited in existing spectral GNNs.
We propose shape-aware regularization on a Newton Interpolation-based spectral filter, which can learn an arbitrary spectral filter and incorporate prior knowledge about the desired filter shape for the corresponding homophily level (an interpolation sketch follows this entry).
arXiv Detail & Related papers (2023-10-16T04:57:30Z)
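For the shape-aware spectral learning entry above, here is a hedged sketch of the Newton Interpolation ingredient only: a polynomial filter is fit through a few (eigenvalue, desired response) pairs by divided differences and applied in the spectral domain. The interpolation points and the target high-pass shape are invented for the example, and the shape-aware regularization itself is not reproduced.

```python
import numpy as np

def newton_coeffs(x, y):
    """Divided-difference coefficients of the Newton-form interpolating polynomial."""
    c = np.array(y, dtype=float)
    for j in range(1, len(x)):
        c[j:] = (c[j:] - c[j - 1:-1]) / (x[j:] - x[:-j])
    return c

def newton_eval(c, x_nodes, t):
    """Evaluate the Newton-form polynomial with coefficients c at points t."""
    out = np.full_like(t, c[-1], dtype=float)
    for j in range(len(c) - 2, -1, -1):
        out = out * (t - x_nodes[j]) + c[j]
    return out

# Interpolation points over the normalized-Laplacian spectrum [0, 2]:
# here an invented high-pass-ish target (small response at low frequencies).
x_nodes = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y_target = np.array([0.0, 0.2, 0.5, 0.8, 1.0])
c = newton_coeffs(x_nodes, y_target)

# Apply the interpolated filter shape to a toy graph's features.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1) ** -0.5
L = np.eye(4) - d[:, None] * A * d[None, :]
vals, U = np.linalg.eigh(L)
g = newton_eval(c, x_nodes, vals)             # filter response at each eigenvalue
X = np.random.default_rng(2).normal(size=(4, 2))
Z = U @ (g[:, None] * (U.T @ X))              # spectral filtering U g(Lambda) U^T X
print(Z.shape)  # (4, 2)
```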
- GPatcher: A Simple and Adaptive MLP Model for Alleviating Graph Heterophily [15.93465948768545]
We demystify the impact of graph heterophily on graph neural network (GNN) filters.
We propose a simple yet powerful GNN named GPatcher that leverages the patch-Mixer architecture.
Our model demonstrates outstanding performance on node classification compared with popular homophily GNNs and state-of-the-art heterophily GNNs.
arXiv Detail & Related papers (2023-06-25T20:57:35Z)
- Node-oriented Spectral Filtering for Graph Neural Networks [38.0315325181726]
Graph neural networks (GNNs) have shown remarkable performance on homophilic graph data.
In general, a universal spectral filter learned on the graph from a global perspective may still struggle to adapt to variations in local patterns.
We propose node-oriented spectral filtering for graph neural networks (NFGNN).
arXiv Detail & Related papers (2022-12-07T14:15:28Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
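Based only on the one-sentence description of RE-GNNs above (one parameter per relation plus a self-loop weight), the sketch below performs relation-weighted neighbor aggregation on a toy two-relation graph; the scalar weights would be learned in the actual model, and all names are illustrative.

```python
import numpy as np

def relation_weighted_propagation(adj_per_relation, X, rel_weight, self_weight):
    """Aggregate neighbor features with one scalar weight per relation (edge type)
    plus a separate scalar for the self-loop connection."""
    out = self_weight * X
    for A_r, w_r in zip(adj_per_relation, rel_weight):
        deg = A_r.sum(axis=1, keepdims=True)
        deg[deg == 0] = 1.0                       # avoid division by zero
        out = out + w_r * (A_r @ X) / deg         # mean aggregation per relation
    return out

rng = np.random.default_rng(3)
n = 5
# Two edge types on the same node set (a toy heterogeneous graph).
A1 = (rng.random((n, n)) < 0.3).astype(float); np.fill_diagonal(A1, 0)
A2 = (rng.random((n, n)) < 0.3).astype(float); np.fill_diagonal(A2, 0)
X = rng.normal(size=(n, 4))
# One scalar per relation and one for self-loops; learned in RE-GNN, fixed here.
Z = relation_weighted_propagation([A1, A2], X, rel_weight=[0.8, 0.3], self_weight=1.0)
print(Z.shape)  # (5, 4)
```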
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework, AdaGNN, with a well-smooth adaptive frequency response filter (a per-channel sketch follows this entry).
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
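The AdaGNN summary above mentions a smooth, adaptive frequency response filter; under the assumption that the response is parameterized per feature channel (which may differ in detail from the paper), the sketch below lets each channel subtract a scaled Laplacian-filtered copy of itself, so small parameters preserve high frequencies and large ones act like a low-pass filter.

```python
import numpy as np

def adaptive_frequency_layer(A, X, phi):
    """One propagation step with a per-channel frequency response:
    X <- X - (L_hat @ X) * phi, where L_hat is the normalized Laplacian.
    phi would be learned per layer and per channel; here it is given."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    L_hat = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return X - (L_hat @ X) * phi[None, :]

rng = np.random.default_rng(4)
A = (rng.random((6, 6)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T
X = rng.normal(size=(6, 3))
phi = np.array([0.9, 0.5, 0.1])   # one response parameter per feature channel
Z = adaptive_frequency_layer(A, X, phi)
print(Z.shape)  # (6, 3)
```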
- Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model, which is capable of handling any graphs without being restricted by their underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
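The last entry views GNNs as compositions of graph convolutional filters that are permutation equivariant; the sketch below implements a simple polynomial graph filter y = sum_k h_k S^k x and checks numerically that relabeling the graph and its input signal relabels the output in the same way. This is a textbook graph-signal-processing example, not code from the paper.

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph filter: y = sum_k h[k] * S^k x, with S a graph shift operator."""
    y = np.zeros_like(x)
    Skx = x.copy()
    for hk in h:
        y = y + hk * Skx
        Skx = S @ Skx
    return y

rng = np.random.default_rng(5)
n = 6
S = (rng.random((n, n)) < 0.4).astype(float)
S = np.triu(S, 1); S = S + S.T                      # adjacency matrix as shift operator
x = rng.normal(size=n)
h = [0.5, 0.3, 0.2]                                  # filter taps

# Permutation equivariance: filtering a relabeled graph equals relabeling the output.
P = np.eye(n)[rng.permutation(n)]                    # random permutation matrix
y = graph_filter(S, x, h)
y_perm = graph_filter(P @ S @ P.T, P @ x, h)
print(np.allclose(P @ y, y_perm))                    # True
```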