A Modal Logic for Explaining some Graph Neural Networks
- URL: http://arxiv.org/abs/2307.05150v1
- Date: Tue, 11 Jul 2023 10:13:25 GMT
- Title: A Modal Logic for Explaining some Graph Neural Networks
- Authors: Pierre Nunn and François Schwarzentruber
- Abstract summary: We show that each formula can be transformed into an equivalent graph neural network (GNN).
We also show that the satisfiability problem is decidable.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose a modal logic in which counting modalities appear
in linear inequalities. We show that each formula can be transformed into an
equivalent graph neural network (GNN). We also show that each GNN can be
transformed into a formula. We show that the satisfiability problem is
decidable. We also discuss some variants that are in PSPACE.
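The core idea of the abstract, counting modalities combined in linear inequalities, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact syntax: the atoms `p`, `q`, the coefficients, and the example formula are all assumptions.

```python
# Hedged sketch: evaluating a formula with counting modalities inside a
# linear inequality on a small directed graph. The formula below,
# 2*#p - 1*#q >= 1, counts successors satisfying p and q respectively.

def count_successors(graph, node, phi, labels):
    """Number of successors of `node` whose label set satisfies `phi`."""
    return sum(1 for v in graph[node] if phi(v, labels))

# graph: adjacency list; labels: node -> set of atomic propositions
graph = {0: [1, 2, 3], 1: [], 2: [], 3: []}
labels = {0: set(), 1: {"p"}, 2: {"p"}, 3: {"q"}}

p = lambda v, lab: "p" in lab[v]
q = lambda v, lab: "q" in lab[v]

def holds(graph, node, labels):
    # Counting modalities #p and #q appear as terms of a linear inequality,
    # which mirrors the weighted-sum aggregation step of a GNN layer.
    return (2 * count_successors(graph, node, p, labels)
            - 1 * count_successors(graph, node, q, labels)) >= 1

print(holds(graph, 0, labels))  # node 0: 2*2 - 1*1 = 3 >= 1
```

The weighted sum over successors is what makes the correspondence with GNN aggregation plausible: a message-passing layer likewise computes a linear combination of neighbour features and compares it against a threshold.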
Related papers
- Logical Distillation of Graph Neural Networks
We present a logic-based interpretable model for learning on graphs and an algorithm to distill this model from a Graph Neural Network (GNN).
Recent results have shown connections between the expressivity of GNNs and the two-variable fragment of first-order logic with counting quantifiers (C2).
arXiv Detail & Related papers (2024-06-11T10:18:58Z)
- A Logic for Reasoning About Aggregate-Combine Graph Neural Networks
We show that each formula can be transformed into an equivalent graph neural network (GNN).
We also show that the satisfiability problem is PSPACE-complete.
arXiv Detail & Related papers (2024-04-30T21:16:38Z)
- Degree-based stratification of nodes in Graph Neural Networks
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- The Logic of Graph Neural Networks
Graph neural networks (GNNs) are deep learning architectures for machine learning problems on graphs.
It has been shown that the expressiveness of GNNs can be characterised precisely by the Weisfeiler-Leman algorithms and by variable counting logics.
arXiv Detail & Related papers (2021-04-29T19:23:26Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
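The denoising view above can be sketched as an optimization problem; the notation here is a hedged reconstruction of the standard graph signal denoising objective, and the trade-off constant $c$ is an assumption, not necessarily the paper's exact formulation:

```latex
\min_{F} \; \|F - X\|_F^2 + c \, \operatorname{tr}\!\left(F^{\top} L F\right)
```

The first term keeps the denoised features $F$ close to the input features $X$; the second term, with $L$ the graph Laplacian, penalizes feature differences across edges. A single gradient step on this objective yields a neighbour-averaging update of the kind performed by GNN aggregation layers.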
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Graph Neural Networks: Architectures, Stability and Transferability
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Expressive Power of Invariant and Equivariant Graph Neural Networks
We show that Folklore Graph Neural Networks (FGNN) are the most expressive architectures proposed so far for a given tensor order.
FGNNs are able to learn how to solve the problem, leading to much better average performance than existing algorithms.
arXiv Detail & Related papers (2020-06-28T16:35:45Z)
- Evaluating Logical Generalization in Graph Neural Networks
We study the task of logical generalization using graph neural networks (GNNs).
Our benchmark suite, GraphLog, requires that learning algorithms perform rule induction in different synthetic logics.
We find that the ability for models to generalize and adapt is strongly determined by the diversity of the logical rules they encounter during training.
arXiv Detail & Related papers (2020-03-14T05:45:55Z)
- Random Features Strengthen Graph Neural Networks
Graph neural networks (GNNs) are powerful machine learning models for various graph learning tasks.
In this paper, we demonstrate that GNNs become powerful just by adding a random feature to each node.
We show that the addition of random features enables GNNs to solve various problems that normal GNNs, including graph convolutional networks (GCNs) and graph isomorphism networks (GINs), cannot solve.
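The random-feature effect described above can be sketched minimally: on a graph where every node looks identical to message passing (a 6-cycle with constant features), plain mean aggregation cannot separate any two nodes, while a random extra feature breaks the symmetry. The two-round mean aggregation here is a simplified stand-in, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# 6-cycle: every node has the same degree and the same constant feature,
# so plain neighbour averaging cannot tell any two nodes apart.
n = 6
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def aggregate(feats):
    # one round of mean-of-neighbours aggregation (GCN-style, simplified)
    return np.stack([feats[adj[i]].mean(axis=0) for i in range(n)])

const = np.ones((n, 1))
h_plain = aggregate(aggregate(const))
# all rows of h_plain are identical: the nodes are indistinguishable

rand_feats = np.concatenate([const, rng.random((n, 1))], axis=1)
h_rand = aggregate(aggregate(rand_feats))
# rows of h_rand now differ, so downstream layers can separate the nodes
```

The random column acts as an (almost surely unique) identifier per node, which is why GNNs with random features can distinguish structures that the Weisfeiler-Leman refinement underlying GCNs and GINs treats as equivalent.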
arXiv Detail & Related papers (2020-02-08T12:47:29Z)
- Efficient Probabilistic Logic Reasoning with Graph Neural Networks
Markov Logic Networks (MLNs) can be used to address many knowledge graph problems.
Inference in MLN is computationally intensive, making the industrial-scale application of MLN very difficult.
We propose a graph neural network (GNN) variant, named ExpressGNN, which strikes a nice balance between the representation power and the simplicity of the model.
arXiv Detail & Related papers (2020-01-29T23:34:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.