Graph Laplacians, Riemannian Manifolds and their Machine-Learning
- URL: http://arxiv.org/abs/2006.16619v1
- Date: Tue, 30 Jun 2020 09:16:56 GMT
- Title: Graph Laplacians, Riemannian Manifolds and their Machine-Learning
- Authors: Yang-Hui He, Shing-Tung Yau
- Abstract summary: We apply some of the latest techniques in data science such as supervised and unsupervised machine-learning and topological data analysis to the Wolfram database of some 8000 finite graphs.
We find that neural classifiers, regressors and networks can perform, with high efficiency and accuracy, a multitude of tasks ranging from recognizing graph Ricci-flatness, to predicting the spectral gap, to detecting the presence of Hamiltonian cycles.
- Score: 2.258160413679475
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Laplacians as well as related spectral inequalities and (co-)homology
provide a foray into discrete analogues of Riemannian manifolds, providing a
rich interplay between combinatorics, geometry and theoretical physics. We
apply some of the latest techniques in data science such as supervised and
unsupervised machine-learning and topological data analysis to the Wolfram
database of some 8000 finite graphs in light of studying these correspondences.
Encouragingly, we find that neural classifiers, regressors and networks can
perform, with high efficiency and accuracy, a multitude of tasks ranging from
recognizing graph Ricci-flatness, to predicting the spectral gap, to detecting
the presence of Hamiltonian cycles, etc.
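The kind of pipeline the abstract describes can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' code or dataset: the networkx, numpy and scikit-learn calls and the toy Erdős-Rényi graphs (standing in for the ~8000-graph Wolfram database) are assumptions for the example. It trains a small regressor to predict the spectral gap, here taken as the second-smallest Laplacian eigenvalue, from adjacency-spectrum features.

```python
# Minimal sketch (not the authors' pipeline): Laplacian spectra of toy graphs
# and a small neural regressor for the spectral gap.
import networkx as nx
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def adjacency_spectrum(G, k=8):
    """Top-k eigenvalues of the adjacency matrix, used here as a feature vector."""
    A = nx.adjacency_matrix(G).toarray().astype(float)
    return np.sort(np.linalg.eigvalsh(A))[-k:]

def spectral_gap(G):
    """Second-smallest eigenvalue of the combinatorial Laplacian L = D - A."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    return np.sort(np.linalg.eigvalsh(L))[1]

# Toy random graphs standing in for the finite-graph database used in the paper.
graphs = [nx.erdos_renyi_graph(12, p) for p in np.linspace(0.2, 0.8, 400)]
X = np.array([adjacency_spectrum(G) for G in graphs])
y = np.array([spectral_gap(G) for G in graphs])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
reg = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
reg.fit(X_tr, y_tr)
print("held-out R^2:", round(reg.score(X_te, y_te), 3))
```

The final line reports the held-out R^2, mirroring the spectral-gap regression task described in the abstract; the classification tasks (Ricci-flatness, Hamiltonian cycles) would replace the regressor with a classifier over the same spectral features.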
Related papers
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how well the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, to address this mismatch.
Using soft manifolds for graph embedding provides continuous spaces suitable for data-analysis tasks on complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Transfer operators on graphs: Spectral clustering and beyond [1.147633309847809]
We show that spectral clustering of undirected graphs can be interpreted in terms of eigenfunctions of the Koopman operator.
We propose novel clustering algorithms for directed graphs based on generalized transfer operators; a minimal Laplacian-based spectral-clustering sketch appears after this list.
arXiv Detail & Related papers (2023-05-19T15:52:08Z) - On the Expressivity of Persistent Homology in Graph Learning [13.608942872770855]
Persistent homology, a technique from computational topology, has recently shown strong empirical performance in the context of graph classification.
This paper provides a brief introduction to persistent homology in the context of graphs, as well as a theoretical discussion and empirical analysis of its expressivity for graph learning tasks.
arXiv Detail & Related papers (2023-02-20T08:19:19Z) - Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning remains limited by the representational capacity of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z) - Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to go beyond purely convolutional models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z) - Graph Networks with Spectral Message Passing [1.0742675209112622]
We introduce the Spectral Graph Network, which applies message passing to both the spatial and spectral domains.
Our results show that the Spectral GN promotes efficient training, reaching high performance with fewer training iterations despite having more parameters.
arXiv Detail & Related papers (2020-12-31T21:33:17Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations; a one-step aggregation sketch in plain numpy appears after this list.
Several recent studies attribute the performance deterioration observed in deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering that is consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
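As referenced in the "Transfer operators on graphs" entry above, here is a minimal spectral-clustering sketch. It uses the standard recipe of embedding nodes with the bottom eigenvectors of the normalized graph Laplacian and running k-means on that embedding; it is not the transfer-operator algorithm of that paper, and the networkx/scikit-learn calls and the planted-partition test graph are illustrative choices.

```python
# Minimal spectral clustering: normalized-Laplacian eigenvectors + k-means.
import networkx as nx
import numpy as np
from sklearn.cluster import KMeans

def spectral_clusters(G, k=2):
    """Cluster the nodes of an undirected graph into k groups."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    _, vecs = np.linalg.eigh(L)            # eigenvectors sorted by eigenvalue
    embedding = vecs[:, :k]                # k smoothest eigenvectors as coordinates
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)
    return dict(zip(G.nodes(), labels))

# Two loosely connected communities: the recovered labels should align with them.
G = nx.planted_partition_graph(2, 15, p_in=0.8, p_out=0.05, seed=0)
print(spectral_clusters(G, k=2))
```

The Koopman/transfer-operator view of the cited paper generalizes exactly this eigenvector construction to directed graphs.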
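The "Towards Deeper Graph Neural Networks" entry refers to graph convolutions as neighborhood aggregation. The sketch below shows one generic GCN-style propagation step, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), in plain numpy; it is a textbook illustration of the aggregation idea, not the DAGNN architecture, and the tiny path graph and random weights are made up for the example.

```python
# One GCN-style neighborhood-aggregation step in plain numpy.
import numpy as np

def gcn_layer(A, H, W):
    """Symmetric normalization, neighbour aggregation, linear transform, ReLU."""
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))   # D^{-1/2}
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)

# Tiny example: a 4-node path graph with random 3-d features and a 3x2 weight matrix.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
print(gcn_layer(A, H, W))
```

Stacking many such layers is what leads to the over-smoothing behaviour that the DAGNN paper addresses by decoupling transformation from propagation.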
This list is automatically generated from the titles and abstracts of the papers on this site.