Homomorphism Counts for Graph Neural Networks: All About That Basis
- URL: http://arxiv.org/abs/2402.08595v5
- Date: Mon, 10 Jun 2024 06:14:34 GMT
- Title: Homomorphism Counts for Graph Neural Networks: All About That Basis
- Authors: Emily Jin, Michael Bronstein, İsmail İlkan Ceylan, Matthias Lanzinger
- Abstract summary: We argue for a more fine-grained approach, which incorporates the homomorphism counts of all structures in the ``basis'' of the target pattern.
This yields strictly more expressive architectures without incurring any additional overhead in terms of computational complexity.
- Score: 8.25219440625445
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A large body of work has investigated the properties of graph neural networks and identified several limitations, particularly pertaining to their expressive power. Their inability to count certain patterns (e.g., cycles) in a graph lies at the heart of such limitations, since many functions to be learned rely on the ability of counting such patterns. Two prominent paradigms aim to address this limitation by enriching the graph features with subgraph or homomorphism pattern counts. In this work, we show that both of these approaches are sub-optimal in a certain sense and argue for a more fine-grained approach, which incorporates the homomorphism counts of all structures in the ``basis'' of the target pattern. This yields strictly more expressive architectures without incurring any additional overhead in terms of computational complexity compared to existing approaches. We prove a series of theoretical results on node-level and graph-level motif parameters and empirically validate them on standard benchmark datasets.
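To make the ``basis'' idea concrete: the subgraph count of a pattern is a fixed linear combination of the homomorphism counts of the graphs in its basis (the pattern's spasm), so supplying all basis counts as features subsumes the subgraph count. The sketch below is a minimal illustration of this identity for the 3-vertex path, whose basis is {P3, K2}; it is not code from the paper, the helper names (count_hom, count_sub) are made up for illustration, and the brute-force enumeration is only practical for tiny graphs.

```python
from itertools import product, permutations

def count_hom(pattern_edges, host_edges):
    """Brute-force count of homomorphisms from the pattern to the host.

    A map f: V(pattern) -> V(host) is a homomorphism if every pattern
    edge {a, b} is mapped onto a host edge {f(a), f(b)}.
    """
    p_nodes = sorted({v for e in pattern_edges for v in e})
    h_nodes = sorted({v for e in host_edges for v in e})
    host = {frozenset(e) for e in host_edges}
    total = 0
    for image in product(h_nodes, repeat=len(p_nodes)):
        f = dict(zip(p_nodes, image))
        if all(frozenset((f[a], f[b])) in host for a, b in pattern_edges):
            total += 1
    return total

def count_sub(pattern_edges, host_edges):
    """Brute-force count of subgraphs of the host isomorphic to the
    pattern: injective homomorphisms divided by |Aut(pattern)|."""
    p_nodes = sorted({v for e in pattern_edges for v in e})
    h_nodes = sorted({v for e in host_edges for v in e})
    pat = {frozenset(e) for e in pattern_edges}
    host = {frozenset(e) for e in host_edges}

    def injective(target_nodes, target_edges):
        hits = 0
        for image in permutations(target_nodes, len(p_nodes)):
            f = dict(zip(p_nodes, image))
            if all(frozenset((f[a], f[b])) in target_edges
                   for a, b in pattern_edges):
                hits += 1
        return hits

    return injective(h_nodes, host) // injective(p_nodes, pat)

# Pattern: the 3-vertex path P3; its basis (spasm) is {P3, K2}.
P3 = [(0, 1), (1, 2)]
K2 = [(0, 1)]

# Host graph: a 4-cycle with one chord.
G = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

# sub(P3, G) equals a fixed linear combination of homomorphism counts
# over the basis: (hom(P3, G) - hom(K2, G)) / 2.
print(count_sub(P3, G), (count_hom(P3, G) - count_hom(K2, G)) // 2)  # 8 8
```

On this toy host graph both sides evaluate to 8: a model given hom(P3, .) and hom(K2, .) can recover sub(P3, .), which is the direction in which the abstract argues that basis homomorphism counts are the more expressive feature set.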
Related papers
- Generalization of Graph Neural Networks through the Lens of Homomorphism [7.223313563198697]
We propose to study the generalization of Graph Neural Networks (GNNs) through a novel perspective: analyzing the entropy of graph homomorphisms.
By linking graph homomorphism with information-theoretic measures, we derive generalization bounds for both graph and node classifications.
These bounds are capable of capturing subtleties inherent in various graph structures, including but not limited to paths, cycles and cliques.
arXiv Detail & Related papers (2024-03-10T03:51:59Z) - Structural Node Embeddings with Homomorphism Counts [2.0131144893314232]
Homomorphism counts capture local structural information.
We experimentally show the effectiveness of homomorphism-count-based node embeddings.
Our approach capitalises on the efficient computability of graph homomorphism counts for bounded treewidth graph classes.
arXiv Detail & Related papers (2023-08-29T13:14:53Z) - Weisfeiler and Lehman Go Paths: Learning Topological Features via Path Complexes [4.23480641508611]
Graph Neural Networks (GNNs) are theoretically bounded by the 1-Weisfeiler-Lehman test.
Our study presents a novel perspective by focusing on simple paths within graphs during the topological message-passing process.
arXiv Detail & Related papers (2023-08-13T19:45:20Z) - On the Expressivity of Persistent Homology in Graph Learning [13.608942872770855]
Persistent homology, a technique from computational topology, has recently shown strong empirical performance in the context of graph classification.
This paper provides a brief introduction to persistent homology in the context of graphs, as well as a theoretical discussion and empirical analysis of its expressivity for graph learning tasks.
arXiv Detail & Related papers (2023-02-20T08:19:19Z) - GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase it models the distribution of features associated with the nodes of the given graph; in the second it complements the edge features conditionally on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z) - An Interpretable Graph Generative Model with Heterophily [38.59200985962146]
We propose the first edge-independent graph generative model that is expressive enough to capture heterophily.
Our experiments demonstrate the effectiveness of our model for a variety of important application tasks.
arXiv Detail & Related papers (2021-11-04T17:34:39Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high-dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z) - Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study trade-offs between the computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, thereby overcoming a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
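For contrast with the homomorphism-count features sketched above, here is a rough sketch of the substructure-count feature augmentation that GSN-style approaches build on: each node's input features are extended with the number of triangles it lies on before an ordinary message-passing GNN is applied. This is not the GSN implementation (GSN uses richer, orbit-aware structural encodings); the function name augment_with_triangle_counts is hypothetical, triangles are just one example substructure, and networkx/numpy are assumed to be available.

```python
import networkx as nx
import numpy as np

def augment_with_triangle_counts(G: nx.Graph, X: np.ndarray) -> np.ndarray:
    """Append, for each node, the number of triangles it lies on to its
    feature vector, so a standard message-passing GNN can consume the
    enriched features (one simple instance of substructure encoding).

    X is assumed to be an (n_nodes, d) feature matrix whose rows follow
    the node order of sorted(G.nodes()).
    """
    nodes = sorted(G.nodes())
    tri = nx.triangles(G)  # node -> number of triangles containing it
    counts = np.array([[tri[v]] for v in nodes], dtype=X.dtype)
    return np.concatenate([X, counts], axis=1)

# Toy usage: a 4-cycle with a chord contains two triangles sharing an edge.
G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])
X = np.ones((G.number_of_nodes(), 3), dtype=np.float32)  # dummy input features
print(augment_with_triangle_counts(G, X)[:, -1])  # per-node triangle counts
```

Replacing such raw subgraph counts with the homomorphism counts of a pattern's full basis is precisely the refinement the main paper argues for.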
This list is automatically generated from the titles and abstracts of the papers on this site.