Simplified PCNet with Robustness
- URL: http://arxiv.org/abs/2403.03676v1
- Date: Wed, 6 Mar 2024 12:57:48 GMT
- Title: Simplified PCNet with Robustness
- Authors: Bingheng Li, Xuanting Xie, Haoxiang Lei, Ruiyi Fang, and Zhao Kang
- Abstract summary: The Poisson-Charlier Network (PCNet) \cite{li2024pc}, our previous work, allows graph representations to be learned from heterophily to homophily.
We simplify PCNet and enhance its robustness.
- Score: 5.127360270463981
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have garnered significant attention for their
success in learning the representation of homophilic or heterophilic graphs.
However, they cannot generalize well to real-world graphs with different levels
of homophily. In response, the Poisson-Charlier Network (PCNet)
\cite{li2024pc}, proposed in our previous work, allows graph representations to be learned
from heterophily to homophily. Although PCNet alleviates the heterophily issue,
there remain challenges in further improving its efficacy and efficiency.
In this paper, we simplify PCNet and enhance its robustness. We first extend
the filter order to continuous values and reduce its parameters. Two variants
with adaptive neighborhood sizes are implemented. Theoretical analysis shows
our model's robustness to graph structure perturbations or adversarial attacks.
We validate our approach through semi-supervised learning tasks on various
datasets representing both homophilic and heterophilic graphs.
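As a rough, hypothetical illustration of what a continuous (non-integer) filter order can look like, the sketch below applies a spectral low-pass filter g_t(lambda) = (1 - lambda/2)^t whose real-valued order t controls the effective neighborhood size. The filter form, the function names, and the eigendecomposition-based evaluation are assumptions for illustration only, not the exact simplified-PCNet formulation.

```python
# Hypothetical sketch of a continuous-order spectral graph filter.
# NOT the exact simplified-PCNet operator: the response (1 - lambda/2)^t
# and all names below are illustrative assumptions.
import numpy as np

def normalized_laplacian(adj: np.ndarray) -> np.ndarray:
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5   # isolated nodes map to 0
    return np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def continuous_order_filter(adj: np.ndarray, x: np.ndarray, t: float) -> np.ndarray:
    """Apply g_t(L) x in the spectral domain with a real-valued order t."""
    lam, u = np.linalg.eigh(normalized_laplacian(adj))   # eigenvalues lie in [0, 2]
    response = np.clip(1.0 - lam / 2.0, 0.0, None) ** t  # larger t = stronger smoothing
    return u @ (response[:, None] * (u.T @ x))           # U g_t(Lambda) U^T x

# Toy usage: a 4-node path graph with random features and a non-integer order.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.random.default_rng(0).normal(size=(4, 3))
print(continuous_order_filter(adj, x, t=1.5).shape)      # (4, 3)
```

In this toy setting, t plays the role of an adaptive neighborhood size: values near 0 leave the features almost unchanged, while larger t averages information over progressively wider neighborhoods.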
Related papers
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophily Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z) - Provable Filter for Real-world Graph Clustering [11.7278692671308]
A principled way to handle practical graphs is urgently needed.
We construct two graphs that are highly homophilic and heterophilic, respectively.
We validate our approach through extensive experiments on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2024-03-06T12:37:49Z) - Universally Robust Graph Neural Networks by Preserving Neighbor
Similarity [5.660584039688214]
We introduce a novel robust model termed NSPGNN, which incorporates a dual kNN-graph pipeline to supervise neighbor-similarity-guided propagation.
Experiments on both homophilic and heterophilic graphs validate the universal robustness of NSPGNN compared to the state-of-the-art methods.
arXiv Detail & Related papers (2024-01-18T06:57:29Z) - PC-Conv: Unifying Homophily and Heterophily with Two-fold Filtering [7.444454681645474]
We propose a two-fold filtering mechanism to extract homophily in heterophilic graphs and vice versa.
In particular, we extend the graph heat equation to perform heterophilic aggregation of global information from a long distance (a diffusion sketch follows below).
To further exploit information at multiple orders, we introduce a powerful graph convolution, PC-Conv, and its instantiation PCNet.
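As a loose companion to the heat-equation remark above, the following hypothetical sketch approximates heat-kernel diffusion exp(-tau * L) x with a truncated Taylor series; the choice of Laplacian, the truncation, and all names are assumptions, and the actual two-fold PC-Conv filter is not reproduced here.

```python
# Hypothetical heat-kernel diffusion sketch (homophilic, low-pass aggregation);
# NOT the PC-Conv two-fold filter. tau controls how far information spreads.
import numpy as np

def sym_norm_laplacian(adj: np.ndarray) -> np.ndarray:
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d = np.zeros_like(deg)
    d[deg > 0] = deg[deg > 0] ** -0.5
    return np.eye(len(adj)) - d[:, None] * adj * d[None, :]

def heat_propagate(adj: np.ndarray, x: np.ndarray,
                   tau: float = 2.0, terms: int = 12) -> np.ndarray:
    """Approximate exp(-tau * L) @ x with `terms` Taylor-series terms."""
    lap = sym_norm_laplacian(adj)
    out = np.zeros_like(x, dtype=float)
    term = x.astype(float)                      # k = 0 term of the series
    for k in range(terms):
        out = out + term                        # accumulate the k-th term
        term = (-tau / (k + 1)) * (lap @ term)  # recurrence gives term k + 1
    return out

# Larger tau lets one-hot signals diffuse over longer graph distances.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(np.round(heat_propagate(adj, np.eye(3), tau=3.0), 3))
```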
arXiv Detail & Related papers (2023-12-22T05:04:28Z) - Demystifying Structural Disparity in Graph Neural Networks: Can One Size
Fit All? [61.35457647107439]
Most real-world homophilic and heterophilic graphs are composed of a mixture of nodes exhibiting both homophilic and heterophilic structural patterns.
We provide evidence that Graph Neural Networks (GNNs) on node classification typically perform admirably on homophilic nodes.
We then propose a rigorous, non-i.i.d. PAC-Bayesian generalization bound for GNNs, revealing reasons for the performance disparity.
arXiv Detail & Related papers (2023-06-02T07:46:20Z) - Resisting Graph Adversarial Attack via Cooperative Homophilous
Augmentation [60.50994154879244]
Recent studies show that Graph Neural Networks are vulnerable and easily fooled by small perturbations.
In this work, we focus on the emerging but critical attack, namely, the Graph Injection Attack (GIA).
We propose a general defense framework CHAGNN against GIA through cooperative homophilous augmentation of graph data and model.
arXiv Detail & Related papers (2022-11-15T11:44:31Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like") and to fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs.
arXiv Detail & Related papers (2021-06-11T02:44:00Z) - Meta-path Free Semi-supervised Learning for Heterogeneous Networks [16.641434334366227]
Graph neural networks (GNNs) have been widely used in representation learning on graphs and achieved superior performance in tasks such as node classification.
In this paper, we propose simple and effective graph neural networks for heterogeneous graphs, without using meta-paths.
arXiv Detail & Related papers (2020-10-18T06:01:58Z) - Graph Neural Networks with Heterophily [40.23690407583509]
We propose a novel framework called CPGNN that generalizes GNNs for graphs with either homophily or heterophily.
We show that replacing the compatibility matrix in our framework with the identity (which represents pure homophily) reduces it to GCN; a minimal illustration follows after this entry.
arXiv Detail & Related papers (2020-09-28T18:29:36Z)
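The compatibility-matrix idea from the CPGNN entry above can be illustrated with the following minimal, hypothetical sketch (assumed names and a row-normalized adjacency, not the paper's exact formulation): class beliefs are propagated through a class-compatibility matrix H, and choosing H as the identity recovers plain homophilic neighborhood averaging, mirroring the reduction to GCN noted above.

```python
# Hypothetical sketch of compatibility-guided belief propagation.
# Names and normalization are assumptions; this is not CPGNN's actual code.
import numpy as np

def propagate_beliefs(adj: np.ndarray, beliefs: np.ndarray,
                      compat: np.ndarray) -> np.ndarray:
    """One propagation step: neighbors vote through the compatibility matrix."""
    deg = adj.sum(axis=1, keepdims=True)
    a_norm = adj / np.maximum(deg, 1.0)   # row-normalized adjacency
    return a_norm @ beliefs @ compat      # A_norm B H

adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
beliefs = np.array([[0.9, 0.1], [0.2, 0.8], [0.3, 0.7]])

# H = I: homophilic case, plain averaging of neighbor beliefs (GCN-like).
homophilic = propagate_beliefs(adj, beliefs, np.eye(2))
# Off-diagonal H: heterophilic case, neighbors are expected to disagree.
heterophilic = propagate_beliefs(adj, beliefs, np.array([[0.1, 0.9],
                                                         [0.9, 0.1]]))
print(np.round(homophilic, 2), np.round(heterophilic, 2), sep="\n")
```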
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.