The Exact Class of Graph Functions Generated by Graph Neural Networks
- URL: http://arxiv.org/abs/2202.08833v1
- Date: Thu, 17 Feb 2022 18:54:27 GMT
- Title: The Exact Class of Graph Functions Generated by Graph Neural Networks
- Authors: Mohammad Fereydounian, Hamed Hassani, Javid Dadashkarimi, Amin Karbasi
- Abstract summary: Given a graph function defined on an arbitrary set of edge weights and node features, does there exist a Graph Neural Network (GNN) whose output is identical to the graph function?
In this paper, we fully answer this question and characterize the class of graph problems that can be represented by GNNs.
We show that this condition can be efficiently verified by checking quadratically many constraints.
- Score: 43.25172578943894
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Given a graph function, defined on an arbitrary set of edge weights and node
features, does there exist a Graph Neural Network (GNN) whose output is
identical to the graph function? In this paper, we fully answer this question
and characterize the class of graph problems that can be represented by GNNs.
We identify an algebraic condition, in terms of the permutation of edge weights
and node features, which proves to be necessary and sufficient for a graph
problem to lie within the reach of GNNs. Moreover, we show that this condition
can be efficiently verified by checking quadratically many constraints. Note
that our refined characterization of the expressive power of GNNs is
orthogonal to theoretical results showing the equivalence between GNNs and
the Weisfeiler-Lehman graph isomorphism heuristic. For instance, our
characterization implies that many natural graph problems, such as min-cut
value, max-flow value, and max-clique size, can be represented by a GNN. In
contrast, and rather surprisingly, there exist very simple graphs for which no
GNN can correctly find the length of the shortest paths between all nodes. Note
that finding shortest paths is one of the most classical problems in Dynamic
Programming (DP). Thus, the aforementioned negative example highlights the
misalignment between DP and GNN, even though (conceptually) they follow very
similar iterative procedures. Finally, we support our theoretical results by
experimental simulations.
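To make the quadratic-constraint claim concrete, here is a minimal sketch, assuming one wants to test a candidate graph function numerically: because the n(n-1)/2 transpositions generate all node permutations, checking the function's behavior under each transposition of edge weights and node features amounts to quadratically many constraints. This is an illustrative reading of the result, not the paper's exact algebraic condition, and the function names below are hypothetical.

```python
import itertools
import numpy as np

def passes_transposition_checks(f, W, X, atol=1e-8):
    """Check f(W, X) == f(P W P^T, P X) for every transposition P.

    Transpositions generate the symmetric group, so a permutation-
    invariant scalar graph function only needs to pass these
    n * (n - 1) / 2 checks -- the "quadratically many constraints"
    mentioned in the abstract. Illustrative sketch only: the paper
    states an exact algebraic condition, not a numerical test.
    """
    n = W.shape[0]
    base = f(W, X)
    for i, j in itertools.combinations(range(n), 2):
        perm = list(range(n))
        perm[i], perm[j] = perm[j], perm[i]   # swap nodes i and j
        W_perm = W[np.ix_(perm, perm)]        # permute edge weights
        X_perm = X[perm]                      # permute node features
        if not np.isclose(f(W_perm, X_perm), base, atol=atol):
            return False
    return True

# Hypothetical demo: total edge weight is trivially permutation-
# invariant, so it passes; "features of node 0" is not, so it fails.
rng = np.random.default_rng(0)
W = rng.random((5, 5)); W = (W + W.T) / 2      # symmetric edge weights
X = rng.random((5, 3))                         # node features
print(passes_transposition_checks(lambda W, X: W.sum(), W, X))     # True
print(passes_transposition_checks(lambda W, X: X[0].sum(), W, X))  # False
```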
Related papers
- Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning [3.236774847052122]
We study the expressive power of graph neural networks through the lens of graph partitioning.
We introduce a novel GNN architecture, namely Graph Partitioning Neural Networks (GPNNs)
arXiv Detail & Related papers (2023-12-14T06:08:35Z)
- Robust Graph Neural Network based on Graph Denoising [10.564653734218755]
Graph Neural Networks (GNNs) have emerged as a prominent alternative for learning problems dealing with non-Euclidean datasets.
This work proposes a robust implementation of GNNs that explicitly accounts for the presence of perturbations in the observed topology.
arXiv Detail & Related papers (2023-12-11T17:43:57Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator (see the sketch after this list).
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB)
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
arXiv Detail & Related papers (2022-10-27T16:00:45Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with graph size, and (ii) graph filters have a transferability-discriminability tradeoff that, in GNNs, is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- On the approximation capability of GNNs in node classification/regression tasks [4.141514895639094]
Graph Neural Networks (GNNs) are a broad class of connectionist models for graph processing.
We show that GNNs are universal approximators in probability for node classification/regression tasks.
arXiv Detail & Related papers (2021-06-16T17:46:51Z)
- Towards Scale-Invariant Graph-related Problem Solving by Iterative Homogeneous Graph Neural Networks [39.370875358317946]
Current graph neural networks (GNNs) lack generalizability with respect to scales (graph sizes, graph diameters, edge weights, etc.) when solving many graph analysis problems.
We propose several extensions to address this issue. First, inspired by the dependency of the number of iterations of common graph theory algorithms on graph size, we learn to terminate the message passing process in GNNs adaptively according to the progress.
Second, inspired by the fact that many graph theory algorithms are homogeneous with respect to graph weights, we introduce homogeneous transformation layers, which are universal homogeneous function approximators, to convert ordinary GNNs into their homogeneous counterparts.
arXiv Detail & Related papers (2020-10-26T12:57:28Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient because they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Expressive Power of Invariant and Equivariant Graph Neural Networks [10.419350129060598]
We show that Folklore Graph Neural Networks (FGNN) are the most expressive architectures proposed so far for a given tensor order.
FGNNs are able to learn how to solve the problem, leading to much better average performances than existing algorithms.
arXiv Detail & Related papers (2020-06-28T16:35:45Z)
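Regarding the kernelized Gumbel-Softmax operator mentioned in the NodeFormer entry above, the following is a minimal sketch of the standard (non-kernelized) Gumbel-Softmax for reference: it yields a differentiable relaxation of sampling a discrete neighbor, which NodeFormer's kernelized variant then approximates without materializing the dense attention matrix. This shows the textbook operator only, not NodeFormer's implementation.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Differentiable relaxation of categorical sampling.

    Draws Gumbel(0, 1) noise, adds it to the logits, and applies a
    temperature-scaled softmax: as tau -> 0 the output approaches a
    one-hot sample. NodeFormer pairs this with a kernelized attention
    trick to avoid the dense n x n matrix; that part is omitted here.
    """
    rng = rng or np.random.default_rng()
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max(axis=-1, keepdims=True)  # for numerical stability
    expy = np.exp(y)
    return expy / expy.sum(axis=-1, keepdims=True)

# Example: relaxed choice of one neighbor among four for a node.
print(gumbel_softmax(np.log([0.1, 0.2, 0.3, 0.4]), tau=0.1))
```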
This list is automatically generated from the titles and abstracts of the papers on this site.