Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph
Matching
- URL: http://arxiv.org/abs/2301.02780v2
- Date: Wed, 1 Nov 2023 06:35:30 GMT
- Title: Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph
Matching
- Authors: Fang Wu, Siyuan Li, Xurui Jin, Yinghui Jiang, Dragomir Radev,
Zhangming Niu, Stan Z. Li
- Abstract summary: We propose a novel non-parametric subgraph matching framework, dubbed MatchExplainer, to explore explanatory subgraphs.
It couples the target graph with other counterpart instances and identifies the most crucial joint substructure by minimizing the node correspondence-based distance.
Experiments on synthetic and real-world datasets show the effectiveness of our MatchExplainer, which outperforms all state-of-the-art parametric baselines by significant margins.
- Score: 68.35685422301613
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The success of graph neural networks (GNNs) provokes the question about
explainability: ``Which fraction of the input graph is the most determinant of
the prediction?'' Particularly, parametric explainers prevail in existing
approaches because of their more robust capability to decipher the black-box
(i.e., target GNNs). In this paper, based on the observation that graphs
typically share some common motif patterns, we propose a novel non-parametric
subgraph matching framework, dubbed MatchExplainer, to explore explanatory
subgraphs. It couples the target graph with other counterpart instances and
identifies the most crucial joint substructure by minimizing the node
correspondence-based distance. Moreover, we note that existing graph sampling
or node-dropping methods usually suffer from the false-positive sampling
problem. To alleviate this issue, we design a new augmentation paradigm named
MatchDrop. It takes advantage of MatchExplainer to fix the most informative
portion of the graph and applies graph augmentations only to the remaining,
less informative part. Extensive experiments on synthetic and real-world
datasets show the effectiveness of our MatchExplainer, which outperforms all
state-of-the-art parametric baselines by significant margins. Results also
demonstrate that MatchDrop is a general scheme to be equipped with GNNs for
enhanced performance. The code is available at:
https://github.com/smiles724/MatchExplainer.
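The abstract's core idea, pairing a target graph with a counterpart and keeping the joint substructure with the smallest node correspondence-based distance, can be sketched roughly as follows. This is a hypothetical illustration, not the paper's exact algorithm: the greedy matching rule, the squared-distance cost, and all names (`match_explain`, `target_feats`, etc.) are assumptions made for the sketch.

```python
# Hypothetical sketch of the MatchExplainer idea: greedily match each target
# node to its closest counterpart node by feature distance, then keep the k
# target nodes with the lowest matching cost as the explanatory substructure.

def match_explain(target_feats, counterpart_feats, k):
    """target_feats / counterpart_feats: dicts mapping node id -> feature list.
    Returns the k target nodes with the smallest correspondence-based cost."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    costs, used = {}, set()
    for u, fu in target_feats.items():
        # best unused counterpart node for u (greedy one-to-one matching)
        best, best_d = None, float("inf")
        for v, fv in counterpart_feats.items():
            if v in used:
                continue
            d = dist(fu, fv)
            if d < best_d:
                best, best_d = v, d
        used.add(best)
        costs[u] = best_d
    # nodes with the lowest matching cost form the joint substructure
    return sorted(costs, key=costs.get)[:k]
```

In this toy form, nodes whose features recur across both graphs (shared motifs) are cheap to match and are therefore selected as the explanation.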
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
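The Gumbel-Softmax operator mentioned above can be illustrated with a minimal sketch. This shows only the core reparameterization trick (sampling a relaxed one-hot vector differentiably); NodeFormer's kernelized all-pair variant is considerably more involved, and the function name and temperature default here are assumptions.

```python
# Minimal Gumbel-Softmax sketch: perturb logits with Gumbel noise, then apply
# a temperature-scaled softmax to obtain a relaxed (differentiable) sample.
import math
import random

def gumbel_softmax(logits, tau=1.0):
    """Sample a relaxed one-hot vector from the Gumbel-Softmax distribution."""
    # Gumbel(0, 1) noise via the inverse-CDF: -log(-log(U)), U ~ Uniform(0, 1)
    gumbels = [-math.log(-math.log(random.random())) for _ in logits]
    scores = [(l + g) / tau for l, g in zip(logits, gumbels)]
    # numerically stable softmax over the perturbed scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]
```

As tau shrinks toward zero the output approaches a hard one-hot choice, which is what lets a model sample discrete edges while still backpropagating through the sampling step.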
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Reinforced Causal Explainer for Graph Neural Networks [112.57265240212001]
Explainability is crucial for probing graph neural networks (GNNs).
We propose a reinforcement learning agent, Reinforced Causal Explainer (RC-Explainer).
RC-Explainer generates faithful and concise explanations, and generalizes better to unseen graphs.
arXiv Detail & Related papers (2022-04-23T09:13:25Z)
- Deconfounded Training for Graph Neural Networks [98.06386851685645]
We present a new deconfounded training paradigm (DTP) that better mitigates the confounding effect and latches on the critical information.
Specifically, we adopt the attention modules to disentangle the critical subgraph and trivial subgraph.
It allows GNNs to capture a more reliable subgraph whose relation with the label is robust across different distributions.
arXiv Detail & Related papers (2021-12-30T15:22:35Z)
- Neighborhood Random Walk Graph Sampling for Regularized Bayesian Graph Convolutional Neural Networks [0.6236890292833384]
In this paper, we propose a novel algorithm called Bayesian Graph Convolutional Network using Neighborhood Random Walk Sampling (BGCN-NRWS)
BGCN-NRWS uses a Markov Chain Monte Carlo (MCMC) based graph sampling algorithm utilizing graph structure, reduces overfitting by using a variational inference layer, and yields consistently competitive classification results compared to the state-of-the-art in semi-supervised node classification.
arXiv Detail & Related papers (2021-12-14T20:58:27Z)
- Partial Graph Reasoning for Neural Network Regularization [26.793648333908905]
Dropout, as a commonly used regularization technique, disables neuron activations during network optimization.
We present DropGraph, which learns a regularization function by constructing a stand-alone graph from the backbone features.
This add-on graph regularizes the network during training and can be completely skipped during inference.
arXiv Detail & Related papers (2021-06-03T12:57:01Z)
- On Explainability of Graph Neural Networks via Subgraph Explorations [48.56936527708657]
We propose a novel method, known as SubgraphX, to explain graph neural networks (GNNs).
Our work represents the first attempt to explain GNNs via identifying subgraphs explicitly.
arXiv Detail & Related papers (2021-02-09T22:12:26Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
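The edge-to-node correspondence described above can be shown with a small sketch of the line-graph construction. This is a plain illustrative implementation (the function name and edge representation are assumptions), not the paper's GNN pipeline.

```python
# Line-graph construction: each edge of the original graph becomes a node of
# the line graph, and two line-graph nodes are adjacent iff the underlying
# edges share an endpoint. Link prediction on the original graph thus maps to
# node classification on the line graph.

def line_graph(edges):
    """Return the line graph (as an adjacency dict) of an undirected graph
    given as a list of edges."""
    nodes = [tuple(sorted(e)) for e in edges]      # one line-graph node per edge
    adj = {n: set() for n in nodes}
    for i, e in enumerate(nodes):
        for f in nodes[i + 1:]:
            if set(e) & set(f):                    # edges share an endpoint
                adj[e].add(f)
                adj[f].add(e)
    return adj
```

For the path graph 0-1-2-3, the edges (0,1) and (1,2) share node 1, so they become adjacent line-graph nodes, while (0,1) and (2,3) do not.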
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.