Neural Approximation of Extended Persistent Homology on Graphs
- URL: http://arxiv.org/abs/2201.12032v1
- Date: Fri, 28 Jan 2022 10:54:45 GMT
- Title: Neural Approximation of Extended Persistent Homology on Graphs
- Authors: Zuoyu Yan, Tengfei Ma, Liangcai Gao, Zhi Tang, Yusu Wang, Chao Chen
- Abstract summary: We propose a novel learning method to compute extended persistence diagrams on graphs.
Our method is also efficient; on large and dense graphs, we accelerate the computation by nearly 100 times.
- Score: 23.606830663387775
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Persistent homology is a widely used theory in topological data analysis. In
the context of graph learning, topological features based on persistent
homology have been used to capture potentially high-order structural
information so as to augment existing graph neural network methods. However,
computing extended persistent homology summaries remains slow for large and
dense graphs, especially since in learning applications one has to carry out
this computation potentially many times. Inspired by recent success in neural
algorithmic reasoning, we propose a novel learning method to compute extended
persistence diagrams on graphs. The proposed neural network aims to simulate a
specific algorithm and learns to compute extended persistence diagrams for new
graphs efficiently. Experiments on approximating extended persistence diagrams
and several downstream graph representation learning tasks demonstrate the
effectiveness of our method. Our method is also efficient; on large and dense
graphs, we accelerate the computation by nearly 100 times.
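To make concrete the kind of summary the proposed network learns to approximate (this is a standard union-find computation of ordinary 0-dimensional persistence pairs for a sublevel-set filtration on a graph, not the paper's neural method or the full extended persistence algorithm; the vertex values `f` and edge list are hypothetical inputs):

```python
# Minimal sketch: 0-dim persistence pairs of the sublevel filtration
# induced by vertex values f on a graph, via union-find.
# Note: extended persistence additionally pairs essential features
# (e.g. the component born at the global minimum) using a superlevel
# sweep; that part is omitted here.

def persistence_pairs_0d(f, edges):
    """Each edge enters the filtration at the max of its endpoint
    values; when an edge merges two components, the elder rule says
    the component with the larger birth value dies, producing a
    (birth, death) pair."""
    parent = list(range(len(f)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    pairs = []
    # Process edges in order of their filtration (entry) value.
    for u, v in sorted(edges, key=lambda e: max(f[e[0]], f[e[1]])):
        ru, rv = find(u), find(v)
        if ru == rv:
            continue  # edge closes a loop; no 0-dim merge happens
        # Keep the older root (smaller birth value) as representative.
        if f[ru] > f[rv]:
            ru, rv = rv, ru
        pairs.append((f[rv], max(f[u], f[v])))  # younger component dies
        parent[rv] = ru
    return pairs
```

On a path graph with values [0, 5, 1, 2], for example, the component born at value 1 survives until the edge with entry value 5 merges it into the older component born at 0.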
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
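The Gumbel-Softmax trick that NodeFormer builds on can be illustrated in a few lines (this is the plain, non-kernelized version of the trick as commonly stated, not NodeFormer's operator; the function name and temperature default are illustrative):

```python
# Hedged sketch of Gumbel-Softmax sampling: a differentiable,
# temperature-controlled relaxation of drawing a one-hot sample
# from a categorical distribution given unnormalized logits.
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Return a relaxed one-hot sample over `logits` at temperature `tau`."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1).
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max()           # subtract max for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()  # softmax: nonnegative, sums to 1
```

As tau decreases toward 0 the samples approach exact one-hot vectors; larger tau yields smoother, more uniform samples.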
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - A Comprehensive Survey on Deep Graph Representation Learning [26.24869157855632]
Graph representation learning aims to encode high-dimensional sparse graph-structured data into low-dimensional dense vectors.
Traditional methods have limited model capacity, which constrains learning performance.
Deep graph representation learning has shown great potential and advantages over shallow (traditional) methods.
arXiv Detail & Related papers (2023-04-11T08:23:52Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z) - Persistence Homology for Link Prediction: An Interactive View [15.068319518015421]
Link prediction is an important learning task for graph-structured data.
We propose a novel topological approach to characterize interactions between two nodes.
We also propose a graph neural network method that outperforms state-of-the-art methods on different benchmarks.
arXiv Detail & Related papers (2021-02-20T04:33:59Z) - Discrete Graph Structure Learning for Forecasting Multiple Time Series [14.459541930646205]
Time series forecasting is an extensively studied subject in statistics, economics, and computer science.
In this work, we propose learning the structure simultaneously with a graph neural network (GNN) if the graph is unknown.
Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning.
arXiv Detail & Related papers (2021-01-18T03:36:33Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration observed when stacking many graph convolution layers to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)