The Logic of Graph Neural Networks
- URL: http://arxiv.org/abs/2104.14624v1
- Date: Thu, 29 Apr 2021 19:23:26 GMT
- Title: The Logic of Graph Neural Networks
- Authors: Martin Grohe
- Abstract summary: Graph neural networks (GNNs) are deep learning architectures for machine learning problems on graphs.
It has been shown that the expressiveness of GNNs can be characterised precisely by the Weisfeiler-Leman algorithms and by variable counting logics.
- Score: 0.9355115132408681
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) are deep learning architectures for machine
learning problems on graphs. It has recently been shown that the expressiveness
of GNNs can be characterised precisely by the combinatorial Weisfeiler-Leman
algorithms and by finite variable counting logics. The correspondence has even
led to new, higher-order GNNs corresponding to the WL algorithm in higher
dimensions.
The purpose of this paper is to explain these descriptive characterisations
of GNNs.
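Since the paper's subject is the exact correspondence between GNN expressiveness and Weisfeiler-Leman, a minimal Python sketch of the 1-dimensional algorithm (color refinement) may help; the graph encoding and function name are illustrative choices, not taken from the paper.

```python
def color_refinement(adj, labels, max_rounds=None):
    """1-dimensional Weisfeiler-Leman (color refinement).

    adj    : dict mapping each node to an iterable of its neighbors
    labels : dict mapping each node to an initial integer color
    Returns a stable coloring as a dict node -> integer color id.
    """
    colors = dict(labels)
    rounds = len(adj) if max_rounds is None else max_rounds
    for _ in range(rounds):
        # New signature: own color plus the multiset of neighbor colors.
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        # Compress signatures back to small integer color ids.
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new_colors = {v: palette[sig[v]] for v in adj}
        # The partition only ever refines, so an unchanged number of
        # colors means the coloring is stable.
        if len(set(new_colors.values())) == len(set(colors.values())):
            break
        colors = new_colors
    return colors

# A 6-cycle: every node ends up with the same color, so 1-WL (and hence
# any message-passing GNN) cannot distinguish it from two disjoint triangles.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(color_refinement(cycle6, {v: 0 for v in cycle6}))
```

Two graphs whose stable color multisets differ are certifiably non-isomorphic; roughly, the characterisation discussed in the paper says a message-passing GNN can distinguish two graphs only if this procedure does.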
Related papers
- MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity, and achieved great success.
This effectiveness, however, sacrifices the efficiency of GNNs, since such models enumerate all possible subgraphs.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve this problem.
arXiv Detail & Related papers (2023-10-29T20:32:21Z)
- Explainability in subgraphs-enhanced Graph Neural Networks [12.526174412246107]
Subgraphs-enhanced Graph Neural Networks (SGNNs) have been introduced to enhance the expressive power of GNNs.
In this work, we adapt PGExplainer, one of the most recent explainers for GNNs, to SGNNs.
We show that our framework is successful in explaining the decision process of an SGNN on graph classification tasks.
arXiv Detail & Related papers (2022-09-16T13:39:10Z)
- Graph Neural Networks are Dynamic Programmers [0.0]
Graph neural networks (GNNs) are claimed to align with dynamic programming (DP).
Here we show, using methods from category theory and abstract algebra, that there exists an intricate connection between GNNs and DP.
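A standard concrete instance of this GNN–DP alignment is Bellman-Ford shortest paths: one relaxation round has exactly the shape of a message-passing layer with min-aggregation. A hedged sketch, with graph encoding and names of my own choosing:

```python
import math

def bellman_ford_step(dist, edges):
    """One Bellman-Ford relaxation round, phrased as message passing:
    each node v aggregates the messages dist[u] + w over incoming
    edges (u, v, w) with min, the DP analogue of GNN aggregation."""
    new_dist = dict(dist)
    for u, v, w in edges:
        new_dist[v] = min(new_dist[v], dist[u] + w)
    return new_dist

def shortest_paths(nodes, edges, source):
    dist = {v: math.inf for v in nodes}
    dist[source] = 0.0
    for _ in range(len(nodes) - 1):  # n-1 rounds suffice without negative cycles
        dist = bellman_ford_step(dist, edges)
    return dist
```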
arXiv Detail & Related papers (2022-03-29T13:27:28Z)
- Edge-Level Explanations for Graph Neural Networks by Extending Explainability Methods for Convolutional Neural Networks [33.20913249848369]
Graph Neural Networks (GNNs) are deep learning models that take graph data as inputs, and they are applied to various tasks such as traffic prediction and molecular property prediction.
We extend explainability methods for CNNs, such as Local Interpretable Model-Agnostic Explanations (LIME), Gradient-Based Saliency Maps, and Gradient-Weighted Class Activation Mapping (Grad-CAM) to GNNs.
The experimental results indicate that the LIME-based approach is the most efficient explainability method for multiple tasks in real-world situations, outperforming even the state-of-the-art.
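As a rough illustration of the gradient-based family of methods mentioned here, an edge-level saliency map can be obtained by differentiating the model output with respect to a dense adjacency matrix. The `model(x, adj)` interface below is an assumption for the sketch, not the interface from the paper:

```python
import torch

def edge_saliency(model, x, adj, target_class):
    """Score each (potential) edge by |d logit / d A_ij|.

    Assumes model(x, adj) returns a 1-D tensor of class logits for the
    whole graph, with adj a dense float adjacency matrix (an assumed
    interface, not the one used in the paper)."""
    adj = adj.clone().detach().requires_grad_(True)
    logits = model(x, adj)
    logits[target_class].backward()
    return adj.grad.abs()  # one saliency score per (potential) edge
```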
arXiv Detail & Related papers (2021-11-01T06:27:29Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs sampled from the graphon, with edges drawn as independent Bernoulli random variables.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
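For reference, sampling an n-node graph from a graphon w amounts to drawing node latents uniformly on [0, 1] and then drawing each edge as an independent Bernoulli trial with probability w(u_i, u_j); a small numpy sketch (the function name and example graphon are mine):

```python
import numpy as np

def sample_from_graphon(w, n, rng=None):
    """Sample an n-node undirected graph from a graphon w: [0,1]^2 -> [0,1].

    Node latents u_i are uniform on [0,1]; each edge (i, j) is an
    independent Bernoulli variable with success probability w(u_i, u_j).
    """
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    probs = w(u[:, None], u[None, :])   # w must accept array arguments
    upper = rng.random((n, n)) < probs  # Bernoulli trials
    a = np.triu(upper, k=1)             # keep i < j, no self-loops
    return (a | a.T).astype(int)        # symmetrize: undirected graph

# Example with a simple distance-based graphon.
adj = sample_from_graphon(lambda x, y: 0.9 - 0.8 * np.abs(x - y), n=100)
```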
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Learning Graph Neural Networks with Approximate Gradient Descent [24.49427608361397]
Two types of graph neural networks (GNNs) are investigated, depending on whether labels are attached to nodes or graphs.
A comprehensive framework for designing and analyzing convergence of GNN training algorithms is developed.
The proposed algorithm guarantees a linear convergence rate to the underlying true parameters of GNNs.
arXiv Detail & Related papers (2020-12-07T02:54:48Z)
- Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study tradeoffs between the computational cost and the expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, thereby overcoming a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
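To make the cost side of this trade-off concrete: even the smallest interesting case, k = 3 (triangles), naively requires examining all node triples, and plain message-passing GNNs, being bounded by 1-WL, cannot count triangles at all. A brute-force baseline for comparison (the graph encoding is illustrative):

```python
from itertools import combinations

def count_triangles(adj):
    """Count triangles by enumerating all 3-node subsets: the O(n^3)
    baseline that 1-WL-bounded message-passing GNNs cannot compute.
    adj: dict mapping each node to a set of its neighbors (undirected)."""
    return sum(
        1
        for u, v, w in combinations(adj, 3)
        if v in adj[u] and w in adj[u] and w in adj[v]
    )
```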
arXiv Detail & Related papers (2020-12-06T03:42:54Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
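In this line of work a graph convolutional filter is a polynomial in a graph shift operator S (e.g. the adjacency or Laplacian matrix) applied to a node signal x; a minimal numpy sketch of one such filter, with notation assumed rather than taken from the paper:

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply the graph convolutional filter  y = sum_k h[k] * S^k x,
    where S is a graph shift operator (adjacency or Laplacian matrix),
    x is a graph signal (one value per node), and h are filter taps."""
    y = np.zeros_like(x, dtype=float)
    z = x.astype(float)
    for hk in h:
        y += hk * z  # accumulate h_k * S^k x
        z = S @ z    # advance to S^{k+1} x
    return y
```

A GNN layer then applies a bank of such filters followed by a pointwise nonlinearity.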
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Higher-Order Explanations of Graph Neural Networks via Relevant Walks [3.1510406584101776]
Graph Neural Networks (GNNs) are a popular approach for predicting graph structured data.
In this paper, we show that GNNs can in fact be naturally explained using higher-order expansions.
We extract practically relevant insights on sentiment analysis of text data, structure-property relationships in quantum chemistry, and image classification.
arXiv Detail & Related papers (2020-06-05T17:59:14Z)
- XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black-boxes and lack human intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
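The aggregate-and-combine scheme described above can be written down in a few lines; a minimal sum-aggregation layer in numpy, with weight names and shapes as illustrative assumptions:

```python
import numpy as np

def gnn_layer(adj, h, w_self, w_neigh):
    """One aggregate-and-combine GNN layer with sum aggregation:
    h'_v = relu(W_self h_v + W_neigh sum_{u in N(v)} h_u).
    adj: dict node -> neighbor list; h: dict node -> feature vector;
    w_self, w_neigh: weight matrices with matching shapes (assumed)."""
    out = {}
    for v, neigh in adj.items():
        agg = sum((h[u] for u in neigh), np.zeros_like(h[v]))
        out[v] = np.maximum(w_self @ h[v] + w_neigh @ agg, 0.0)  # ReLU
    return out
```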
arXiv Detail & Related papers (2020-06-03T23:52:43Z)
- Efficient Probabilistic Logic Reasoning with Graph Neural Networks [63.099999467118245]
Markov Logic Networks (MLNs) can be used to address many knowledge graph problems.
Inference in MLNs is computationally intensive, making the industrial-scale application of MLNs very difficult.
We propose a graph neural network (GNN) variant, named ExpressGNN, which strikes a nice balance between the representation power and the simplicity of the model.
arXiv Detail & Related papers (2020-01-29T23:34:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.