On the Robustness of Graph Neural Diffusion to Topology Perturbations
- URL: http://arxiv.org/abs/2209.07754v2
- Date: Thu, 11 May 2023 04:50:47 GMT
- Title: On the Robustness of Graph Neural Diffusion to Topology Perturbations
- Authors: Yang Song, Qiyu Kang, Sijie Wang, Zhao Kai, Wee Peng Tay
- Abstract summary: We show that graph neural PDEs are intrinsically more robust against topology perturbation as compared to other GNNs.
We propose a general graph neural PDE framework based on which a new class of robust GNNs can be defined.
- Score: 30.284359808863588
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural diffusion on graphs is a novel class of graph neural networks that has
attracted increasing attention recently. The capability of graph neural partial
differential equations (PDEs) in addressing common hurdles of graph neural
networks (GNNs), such as the problems of over-smoothing and bottlenecks, has
been investigated but not their robustness to adversarial attacks. In this
work, we explore the robustness properties of graph neural PDEs. We empirically
demonstrate that graph neural PDEs are intrinsically more robust against
topology perturbation as compared to other GNNs. We provide insights into this
phenomenon by exploiting the stability of the heat semigroup under graph
topology perturbations. We discuss various graph diffusion operators and relate
them to existing graph neural PDEs. Furthermore, we propose a general graph
neural PDE framework based on which a new class of robust GNNs can be defined.
We verify that the new model achieves comparable state-of-the-art performance
on several benchmark datasets.
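The core mechanism behind the abstract above is treating feature propagation as a diffusion (heat) equation on the graph, whose solution operator (the heat semigroup) is stable under small topology perturbations. Below is a minimal, hedged sketch of discretized graph heat diffusion on node features; it illustrates the general idea only and is not the paper's actual architecture, and the function names and step sizes (`heat_diffusion`, `tau`, `steps`) are our own.

```python
import torch


def normalized_laplacian(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.where(deg > 0, deg.pow(-0.5), torch.zeros_like(deg))
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    return torch.eye(adj.shape[0]) - a_norm


def heat_diffusion(x: torch.Tensor, adj: torch.Tensor,
                   tau: float = 0.1, steps: int = 20) -> torch.Tensor:
    """Explicit-Euler discretization of the graph heat equation dx/dt = -L x.

    For small tau the update x <- (I - tau L) x is non-expansive, a toy version
    of the semigroup stability the abstract appeals to.
    """
    lap = normalized_laplacian(adj)
    for _ in range(steps):
        x = x - tau * (lap @ x)
    return x


if __name__ == "__main__":
    # Tiny 4-node path graph with 2-dimensional node features.
    adj = torch.tensor([[0., 1., 0., 0.],
                        [1., 0., 1., 0.],
                        [0., 1., 0., 1.],
                        [0., 0., 1., 0.]])
    x = torch.randn(4, 2)
    print(heat_diffusion(x, adj))
```

Perturbing a single edge in `adj` changes the Laplacian only slightly, and because each Euler step is non-expansive for small `tau`, the diffused features move only slightly as well; the paper makes this kind of intuition precise for its graph neural PDE models.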
Related papers
- Coupling Graph Neural Networks with Fractional Order Continuous
Dynamics: A Robustness Study [24.950680319986486]
We rigorously investigate the robustness of graph neural fractional-order differential equation (FDE) models.
This framework extends beyond traditional graph neural (integer-order) ordinary differential equation (ODE) models by implementing the time-fractional Caputo derivative.
arXiv Detail & Related papers (2024-01-09T02:56:52Z) - Graph Neural Stochastic Differential Equations [3.568455515949288]
We present a novel model, Graph Neural Stochastic Differential Equations (Graph Neural SDEs).
This technique enhances Graph Neural Ordinary Differential Equations (Graph Neural ODEs) by embedding randomness into the data representation using Brownian motion (see the integration sketch after this list).
We find that Latent Graph Neural SDEs surpass conventional models like Graph Convolutional Networks and Graph Neural ODEs, especially in confidence prediction.
arXiv Detail & Related papers (2023-08-23T09:20:38Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge-type relations and self-loop connections (see the layer sketch after this list).
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Topological Relational Learning on Graphs [2.4692806302088868]
Graph neural networks (GNNs) have emerged as a powerful tool for graph classification and representation learning.
We propose a novel topological relational inference (TRI) which allows for integrating higher-order graph information to GNNs.
We show that the new TRI-GNN outperforms all 14 state-of-the-art baselines on 6 out of 7 graphs and exhibits higher robustness to perturbations.
arXiv Detail & Related papers (2021-10-29T04:03:27Z) - On the Stability of Graph Convolutional Neural Networks under Edge
Rewiring [22.58110328955473]
Graph neural networks are experiencing a surge of popularity within the machine learning community.
Despite this, their stability, i.e., their robustness to small perturbations in the input, is not yet well understood.
We develop an interpretable upper bound elucidating that graph neural networks are stable to rewiring between high degree nodes.
arXiv Detail & Related papers (2020-10-26T17:37:58Z) - Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute this performance deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Understanding Graph Isomorphism Network for rs-fMRI Functional
Connectivity Analysis [49.05541693243502]
We develop a framework for analyzing fMRI data using the Graph Isomorphism Network (GIN).
One of the important contributions of this paper is the observation that the GIN is a dual representation of a convolutional neural network (CNN) in the graph space.
We exploit CNN-based saliency map techniques for the GNN, which we tailor to the proposed GIN with one-hot encoding.
arXiv Detail & Related papers (2020-01-10T23:40:09Z)
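Several of the related entries above (the fractional-order, ODE, and SDE models) share the same skeleton: node features evolve under a learned vector field defined on the graph, integrated numerically over a pseudo-time interval. The following is a rough sketch in our own notation, not any single paper's implementation: with `noise_scale = 0` the loop is a plain Euler solver (a graph neural ODE), and with `noise_scale > 0` it adds a Brownian-motion increment at each step (the Euler-Maruyama scheme, i.e. the graph neural SDE variant). The class and argument names are assumptions for illustration.

```python
import torch
import torch.nn as nn


class GraphVectorField(nn.Module):
    """Learned drift f(x) = A_hat x W: a simple graph-convolution-style field."""

    def __init__(self, adj: torch.Tensor, dim: int):
        super().__init__()
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        self.register_buffer("a_hat", adj / deg)  # row-normalized adjacency
        self.lin = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.a_hat @ self.lin(x)


def integrate(field: nn.Module, x: torch.Tensor, t_end: float = 1.0,
              steps: int = 50, noise_scale: float = 0.0) -> torch.Tensor:
    """Euler (noise_scale = 0) or Euler-Maruyama (noise_scale > 0) integration."""
    dt = t_end / steps
    for _ in range(steps):
        x = x + dt * field(x)  # deterministic drift step
        if noise_scale > 0:
            x = x + noise_scale * (dt ** 0.5) * torch.randn_like(x)  # Brownian increment
    return x
```

The fractional-order (Caputo-derivative) models mentioned above replace this memoryless one-step update with one that weights the whole past trajectory, which this sketch does not attempt to reproduce.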
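The RE-GNN entry above describes a concrete and very small mechanism: a single learnable scalar per edge type (plus one for self-loops) that scales an otherwise homogeneous graph convolution. A hedged reading of that idea, with our own layer and parameter names rather than the paper's code, might look like this:

```python
import torch
import torch.nn as nn


class REGNNLayer(nn.Module):
    """Homogeneous graph convolution gated by one scalar weight per relation."""

    def __init__(self, num_relations: int, in_dim: int, out_dim: int):
        super().__init__()
        self.rel_weight = nn.Parameter(torch.ones(num_relations))  # one parameter per edge type
        self.self_weight = nn.Parameter(torch.ones(1))              # self-loop importance
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adjs: list[torch.Tensor]) -> torch.Tensor:
        # adjs[r] is the (row-normalized) adjacency matrix of relation r.
        out = self.self_weight * x
        for r, a in enumerate(adjs):
            out = out + self.rel_weight[r] * (a @ x)
        return torch.relu(self.lin(out))
```

The per-relation scalars let a single shared weight matrix serve every edge type, which is where the parameter efficiency claimed in that abstract would come from.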