The logic of rational graph neural networks
- URL: http://arxiv.org/abs/2310.13139v8
- Date: Tue, 13 Aug 2024 17:12:03 GMT
- Title: The logic of rational graph neural networks
- Authors: Sammy Khalife
- Abstract summary: We prove that some GC2 queries of depth $3$ cannot be expressed by GNNs with any rational activation function.
This shows that not all non-polynomial activation functions confer GNNs maximal expressivity.
We also present a rational subfragment of the first order logic (RGC2), and prove that rational GNNs can express RGC2 queries uniformly over all graphs.
- Score: 0.7614628596146602
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The expressivity of Graph Neural Networks (GNNs) can be described via appropriate fragments of first-order logic. Any query of the two-variable fragment of graded modal logic (GC2) interpreted over labeled graphs can be expressed using a Rectified Linear Unit (ReLU) GNN whose size does not grow with graph input sizes [Barceló et al., 2020]. Conversely, a GNN expresses at most a query of GC2, for any choice of activation function. In this article, we prove that some GC2 queries of depth $3$ cannot be expressed by GNNs with any rational activation function. This shows that not all non-polynomial activation functions confer GNNs maximal expressivity, answering an open question formulated by [Grohe, 2021]. This result is also in contrast with the efficient universal approximation properties of rational feedforward neural networks investigated by [Boullé et al., 2020]. We also present a rational subfragment of first-order logic (RGC2), and prove that rational GNNs can express RGC2 queries uniformly over all graphs.
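To make the GC2-to-GNN direction concrete, here is a minimal sketch in the spirit of the counting construction of [Barceló et al., 2020], not the paper's own proof: the depth-1 GC2 query $\varphi(x) = \exists^{\geq 2} y\,(E(x,y) \wedge \mathrm{Blue}(y))$ ("x has at least two blue neighbours") is computed by one sum-aggregation layer followed by the ReLU combination $\mathrm{ReLU}(c-1) - \mathrm{ReLU}(c-2)$, which equals 1 exactly when the integer count $c$ is at least 2. Function and variable names below are illustrative assumptions.

```python
# Minimal sketch (not the paper's construction): one message-passing layer
# with sum aggregation and a ReLU readout expressing the GC2 query
#   phi(x) = exists>=2 y ( E(x,y) AND Blue(y) ).
# Names such as has_at_least_two_blue_neighbors are illustrative.

def relu(t):
    return max(0.0, float(t))

def has_at_least_two_blue_neighbors(adj, blue):
    """adj: dict node -> list of neighbours; blue: dict node -> 0/1 label."""
    out = {}
    for v, neighbors in adj.items():
        # Sum aggregation: c is the (integer) number of blue neighbours of v.
        c = sum(blue[u] for u in neighbors)
        # ReLU(c - 1) - ReLU(c - 2) is 1 iff c >= 2 and 0 iff c <= 1, so the
        # query is decided by a fixed-size ReLU network, independent of the
        # size of the input graph.
        out[v] = relu(c - 1) - relu(c - 2)
    return out

# Toy graph: node 0 has three blue neighbours; nodes 1, 2, 3 only see node 0.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
blue = {0: 0, 1: 1, 2: 1, 3: 1}
print(has_at_least_two_blue_neighbors(adj, blue))  # node 0 -> 1.0, others -> 0.0
```

The paper's negative result concerns deeper (depth-$3$) GC2 queries and rational activation functions, for which it shows no analogous fixed-size construction exists.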
Related papers
- Is uniform expressivity too restrictive? Towards efficient expressivity of graph neural networks [0.6445605125467574]
Uniform expressivity means that a Graph Neural Network (GNN) can express a query without its parameters depending on the size of the input graphs.
We show that many GNNs can still efficiently express GC2 queries in a way that the number of parameters remains logarithmic in the maximal degree of the input graphs.
arXiv Detail & Related papers (2024-10-02T18:09:12Z) - Logical Distillation of Graph Neural Networks [47.859911892875346]
We present a logic-based interpretable model for learning on graphs and an algorithm to distill this model from a Graph Neural Network (GNN)
Recent results have shown connections between the expressivity of GNNs and the two-variable fragment of first-order logic with counting quantifiers (C2)
arXiv Detail & Related papers (2024-06-11T10:18:58Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Calibrate and Boost Logical Expressiveness of GNN Over Multi-Relational
and Temporal Graphs [8.095679736030146]
We investigate $\mathcal{FOC}_2$, a fragment of first-order logic with two variables and counting quantifiers.
We propose a simple graph transformation technique, akin to a preprocessing step, which can be executed in linear time.
Our results consistently demonstrate that R$^2$-GNN with the graph transformation outperforms the baseline methods on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-11-03T00:33:24Z) - The Descriptive Complexity of Graph Neural Networks [2.6728900378310514]
We prove that the graph queries that can be computed by a bounded-depth family of GNNs are definable in the guarded GFO+C fragment of first-order logic with counting and with built-in relations.
We show that queries computable by a single GNN with piecewise linear activations and rational weights are definable in GFO+C without built-in relations.
arXiv Detail & Related papers (2023-03-08T14:32:59Z) - Some Might Say All You Need Is Sum [2.226803104060345]
The expressivity of Graph Neural Networks (GNNs) is dependent on the aggregation functions they employ.
We prove that basic functions, which can be computed exactly by Mean or Max GNNs, are inapproximable by any Sum GNN.
arXiv Detail & Related papers (2023-02-22T19:01:52Z) - On the approximation capability of GNNs in node
classification/regression tasks [4.141514895639094]
Graph Neural Networks (GNNs) are a broad class of connectionist models for graph processing.
We show that GNNs are universal approximators in probability for node classification/regression tasks.
arXiv Detail & Related papers (2021-06-16T17:46:51Z) - Bi-GCN: Binary Graph Convolutional Network [57.733849700089955]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node features.
Our Bi-GCN can reduce the memory consumption by an average of 30x for both the network parameters and input data, and accelerate the inference speed by an average of 47x.
arXiv Detail & Related papers (2020-10-15T07:26:23Z) - The Surprising Power of Graph Neural Networks with Random Node
Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI); a toy sketch of this preprocessing step appears after this list.
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Leman tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Can Graph Neural Networks Count Substructures? [53.256112515435355]
We study the power of graph neural networks (GNNs) via their ability to count attributed graph substructures.
We distinguish between two types of substructure counting: induced-subgraph-count and subgraph-count, and establish both positive and negative answers for popular GNN architectures (see the wedge-counting sketch after this list).
arXiv Detail & Related papers (2020-02-10T18:53:30Z)
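For the random node initialization (RNI) entry above, the preprocessing step it refers to can be sketched as follows, assuming the usual recipe of appending independently sampled random coordinates to every node's feature vector before running an otherwise unchanged GNN; the feature width, distribution, and function name are illustrative assumptions, not taken from that paper.

```python
import random

def add_random_node_features(features, extra_dim=4, seed=0):
    """Append extra_dim random coordinates to each node's feature vector.

    features: dict node -> list of floats. Returns augmented copies; the
    downstream GNN is left unchanged. Width and distribution are illustrative.
    """
    rng = random.Random(seed)
    return {
        v: feat + [rng.uniform(-1.0, 1.0) for _ in range(extra_dim)]
        for v, feat in features.items()
    }

features = {0: [1.0, 0.0], 1: [0.0, 1.0]}
augmented = add_random_node_features(features)
# The random coordinates let message passing separate nodes that plain
# 1-Weisfeiler-Leman colouring (and hence a standard GNN) would conflate.
```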
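For the substructure-counting entry, the two counting notions can be contrasted on the smallest non-trivial pattern. The brute-force sketch below (illustrative only, not from that paper) counts the 2-path (wedge) in a triangle: it occurs 3 times as a subgraph but 0 times as an induced subgraph, because any three nodes of a triangle induce a triangle rather than a path.

```python
from itertools import combinations

def wedge_counts(edges, nodes):
    """Count occurrences of the 2-path x-center-y (wedge) in an undirected graph.

    Returns (subgraph_count, induced_subgraph_count). Brute force, for
    illustration only.
    """
    edge_set = {frozenset(e) for e in edges}
    subgraph = induced = 0
    for a, b, c in combinations(nodes, 3):
        for center, x, y in ((a, b, c), (b, a, c), (c, a, b)):
            if {frozenset((center, x)), frozenset((center, y))} <= edge_set:
                subgraph += 1  # wedge present as a (not necessarily induced) subgraph
                if frozenset((x, y)) not in edge_set:
                    induced += 1  # induced as well: the remaining pair is a non-edge
    return subgraph, induced

triangle = [(0, 1), (1, 2), (0, 2)]
print(wedge_counts(triangle, [0, 1, 2]))  # (3, 0)
```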