On the Expressivity of Persistent Homology in Graph Learning
- URL: http://arxiv.org/abs/2302.09826v3
- Date: Mon, 3 Jun 2024 08:20:31 GMT
- Title: On the Expressivity of Persistent Homology in Graph Learning
- Authors: Rubén Ballester, Bastian Rieck
- Abstract summary: Persistent homology, a technique from computational topology, has recently shown strong empirical performance in the context of graph classification.
This paper provides a brief introduction to persistent homology in the context of graphs, as well as a theoretical discussion and empirical analysis of its expressivity for graph learning tasks.
- Score: 13.608942872770855
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Persistent homology, a technique from computational topology, has recently shown strong empirical performance in the context of graph classification. Being able to capture long range graph properties via higher-order topological features, such as cycles of arbitrary length, in combination with multi-scale topological descriptors, has improved predictive performance for data sets with prominent topological structures, such as molecules. At the same time, the theoretical properties of persistent homology have not been formally assessed in this context. This paper intends to bridge the gap between computational topology and graph machine learning by providing a brief introduction to persistent homology in the context of graphs, as well as a theoretical discussion and empirical analysis of its expressivity for graph learning tasks.
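As a concrete illustration of how persistent homology is typically applied to graphs, the following minimal sketch computes 0-dimensional persistence pairs under a degree-based sublevel-set filtration using a union-find structure. The filtration choice, function names, and the restriction to dimension 0 are illustrative assumptions, not the implementation used in the paper.

```python
def degree_filtration_ph0(num_nodes, edges):
    """0-dim persistence pairs (birth, death) under a degree filtration.

    Each node is born at its degree; an edge appears at the larger degree
    of its endpoints. When an edge merges two components, the component
    born later dies (elder rule). Illustrative sketch only.
    """
    degree = [0] * num_nodes
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1

    parent = list(range(num_nodes))
    birth = degree[:]  # birth value of the component rooted at each node

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    pairs = []
    # Process edges in increasing order of their filtration value.
    for u, v in sorted(edges, key=lambda e: max(degree[e[0]], degree[e[1]])):
        ru, rv = find(u), find(v)
        if ru == rv:
            continue  # edge closes a cycle; it would contribute to dimension 1
        death = max(degree[u], degree[v])
        # Elder rule: the component with the larger birth value dies.
        young, old = (ru, rv) if birth[ru] >= birth[rv] else (rv, ru)
        pairs.append((birth[young], death))
        parent[young] = old

    # Surviving components are essential features (infinite persistence).
    roots = {find(x) for x in range(num_nodes)}
    pairs.extend((birth[r], float("inf")) for r in roots)
    return pairs


if __name__ == "__main__":
    # A 4-cycle with one pendant node attached.
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (3, 4)]
    print(degree_filtration_ph0(5, edges))
```

The degree filtration is only one common choice; in practice, learned or multi-scale node functions can be substituted without changing the pairing procedure.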
Related papers
- Topograph: An efficient Graph-Based Framework for Strictly Topology Preserving Image Segmentation [78.54656076915565]
Topological correctness plays a critical role in many image segmentation tasks.
Most networks are trained using pixel-wise loss functions, such as Dice, neglecting topological accuracy.
We propose a novel, graph-based framework for topologically accurate image segmentation.
arXiv Detail & Related papers (2024-11-05T16:20:14Z) - TopER: Topological Embeddings in Graph Representation Learning [8.052380377159398]
Topological Evolution Rate (TopER) is a low-dimensional embedding approach grounded in topological data analysis.
TopER simplifies a key topological approach, Persistent Homology, by calculating the evolution rate of graph substructures.
Our models achieve or surpass state-of-the-art results across molecular, biological, and social network datasets in tasks such as classification, clustering, and visualization.
arXiv Detail & Related papers (2024-10-02T17:31:33Z) - The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle states that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets on which GNNs' performance is not satisfactory compared to that of graph-agnostic NNs.
arXiv Detail & Related papers (2024-07-12T18:04:32Z) - Homomorphism Counts for Graph Neural Networks: All About That Basis [8.25219440625445]
We argue for a more fine-grained approach, which incorporates the homomorphism counts of all structures in the "basis" of the target pattern (a brute-force illustration of homomorphism counting appears after this list).
This yields strictly more expressive architectures without incurring any additional overhead in terms of computational complexity.
arXiv Detail & Related papers (2024-02-13T16:57:06Z) - Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifold, named soft manifold, that can solve this situation.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - Weisfeiler and Lehman Go Paths: Learning Topological Features via Path Complexes [4.23480641508611]
Graph Neural Networks (GNNs) are theoretically bounded by the 1-Weisfeiler-Lehman test.
Our study presents a novel perspective by focusing on simple paths within graphs during the topological message-passing process.
arXiv Detail & Related papers (2023-08-13T19:45:20Z) - From axioms over graphs to vectors, and back again: evaluating the
properties of graph-based ontology embeddings [78.217418197549]
One approach to generating embeddings introduces a set of nodes and edges for named entities and the structure of logical axioms.
Methods that embed ontologies in graphs (graph projections) have different properties depending on the type of axioms they utilize.
arXiv Detail & Related papers (2023-03-29T08:21:49Z) - A Multi-scale Graph Signature for Persistence Diagrams based on Return
Probabilities of Random Walks [1.745838188269503]
We explore the use of a family of multi-scale graph signatures, based on return probabilities of random walks, to enhance the robustness of topological features (see the sketch after this list).
We propose a deep learning architecture to handle this set input.
Experiments on benchmark graph classification datasets demonstrate that our proposed architecture outperforms other persistent homology-based methods.
arXiv Detail & Related papers (2022-09-28T17:30:27Z) - Generalization Guarantee of Training Graph Convolutional Networks with
Graph Topology Sampling [83.77955213766896]
Graph convolutional networks (GCNs) have recently achieved great empirical success in learning graph-structured data.
To address their scalability issue, graph topology sampling has been proposed to reduce the memory and computational cost of training GCNs.
This paper provides the first theoretical justification of graph topology sampling in training (up to) three-layer GCNs.
arXiv Detail & Related papers (2022-07-07T21:25:55Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally
Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a popular technique for the modeling and analysis of heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this approach, we adopt distinct and well-suited attribute and topology fusion mechanisms in the two views, which are conducive to mining relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z) - Graph Laplacians, Riemannian Manifolds and their Machine-Learning [2.258160413679475]
We apply some of the latest techniques in data science such as supervised and unsupervised machine-learning and topological data analysis to the Wolfram database of some 8000 finite graphs.
We find that neural classifiers, regressors, and networks can perform, with high efficiency and accuracy, a multitude of tasks ranging from recognizing graph Ricci-flatness, to predicting the spectral gap, to detecting the presence of Hamiltonian cycles.
arXiv Detail & Related papers (2020-06-30T09:16:56Z)
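To make the notion of homomorphism counts from the "Homomorphism Counts for Graph Neural Networks" entry concrete, the following sketch counts homomorphisms from a fixed pattern graph into a target graph by brute force. It illustrates the definition only; the basis decomposition and the architecture proposed in that paper are not reproduced here, and all names are illustrative.

```python
from itertools import product


def count_homomorphisms(pattern_edges, pattern_nodes, graph_edges, graph_nodes):
    """Count homomorphisms from a pattern graph into an undirected target graph.

    A homomorphism is any map of pattern nodes to target nodes that sends
    every pattern edge to an edge of the target. Brute force, exponential
    in the pattern size, so only practical for small patterns such as cycles.
    """
    graph_edge_set = set()
    for u, v in graph_edges:
        graph_edge_set.add((u, v))
        graph_edge_set.add((v, u))

    count = 0
    for assignment in product(graph_nodes, repeat=len(pattern_nodes)):
        mapping = dict(zip(pattern_nodes, assignment))
        if all((mapping[u], mapping[v]) in graph_edge_set for u, v in pattern_edges):
            count += 1
    return count


if __name__ == "__main__":
    # Triangle pattern mapped into the complete graph K4: 4 * 3 * 2 = 24 homomorphisms.
    triangle_edges, triangle_nodes = [("a", "b"), ("b", "c"), ("c", "a")], ["a", "b", "c"]
    k4_nodes = [0, 1, 2, 3]
    k4_edges = [(i, j) for i in k4_nodes for j in k4_nodes if i < j]
    print(count_homomorphisms(triangle_edges, triangle_nodes, k4_edges, k4_nodes))
```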
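For the return-probability signature mentioned in the "Multi-scale Graph Signature for Persistence Diagrams" entry, the following sketch computes, for each node, the probability that a simple random walk returns to its start after a given number of steps, read off from powers of the transition matrix. The chosen walk lengths and function names are assumptions for illustration; this is not the architecture described in that paper.

```python
import numpy as np


def return_probability_signature(adj, steps=(1, 2, 4, 8, 16)):
    """Per-node return probabilities of a simple random walk.

    adj: dense (n, n) adjacency matrix of an undirected graph.
    steps: walk lengths at which return probabilities are read off,
           giving one multi-scale feature vector per node.
    """
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    P = adj / deg[:, None]  # row-stochastic transition matrix

    features = np.zeros((adj.shape[0], len(steps)))
    Pt = np.eye(adj.shape[0])
    t = 0
    for j, target in enumerate(sorted(steps)):
        while t < target:
            Pt = Pt @ P
            t += 1
        features[:, j] = np.diag(Pt)  # probability of returning after t steps
    return features


if __name__ == "__main__":
    # Triangle with a pendant node attached.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(return_probability_signature(A, steps=(2, 4)))
```

Reading the diagonal of successive transition-matrix powers is one straightforward way to obtain such multi-scale, permutation-invariant node descriptors; sparse matrices or iterative matrix-vector products would be preferable for large graphs.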