Transfer operators on graphs: Spectral clustering and beyond
- URL: http://arxiv.org/abs/2305.11766v2
- Date: Wed, 14 Feb 2024 08:31:50 GMT
- Title: Transfer operators on graphs: Spectral clustering and beyond
- Authors: Stefan Klus, Maia Trower
- Abstract summary: We show that spectral clustering of undirected graphs can be interpreted in terms of eigenfunctions of the Koopman operator.
We propose novel clustering algorithms for directed graphs based on generalized transfer operators.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs and networks play an important role in modeling and analyzing complex
interconnected systems such as transportation networks, integrated circuits,
power grids, citation graphs, and biological and artificial neural networks.
Graph clustering algorithms can be used to detect groups of strongly connected
vertices and to derive coarse-grained models. We define transfer operators such
as the Koopman operator and the Perron-Frobenius operator on graphs, study
their spectral properties, introduce Galerkin projections of these operators,
and illustrate how reduced representations can be estimated from data. In
particular, we show that spectral clustering of undirected graphs can be
interpreted in terms of eigenfunctions of the Koopman operator and propose
novel clustering algorithms for directed graphs based on generalized transfer
operators. We demonstrate the efficacy of the resulting algorithms on several
benchmark problems and provide different interpretations of clusters.
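The connection stated above, that spectral clustering of an undirected graph can be read as computing eigenfunctions of the Koopman operator, can be sketched with the random walk on a small graph. This is a minimal illustration of the standard construction, not the paper's exact algorithm; the graph and all variable names are chosen for this example.

```python
import numpy as np

# Adjacency matrix of an undirected graph: two triangles {0,1,2} and
# {3,4,5} joined by a single bridge edge between vertices 2 and 3.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

# Row-stochastic transition matrix P = D^{-1} A of the random walk on the
# graph; P acts as the Koopman operator on functions defined on the vertices.
d = A.sum(axis=1)
P = A / d[:, None]

# The leading eigenvalue of P is 1 with a constant eigenfunction; the sign
# pattern of the second eigenfunction yields the two-way spectral cut.
eigvals, eigvecs = np.linalg.eig(P)
order = np.argsort(-eigvals.real)
phi2 = eigvecs[:, order[1]].real

clusters = (phi2 > 0).astype(int)
```

Up to a global sign flip of the eigenvector, the sign of `phi2` separates the two triangles, which is exactly the partition a standard spectral clustering of this graph would return.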
Related papers
- Higher-order Clustering and Pooling for Graph Neural Networks [77.47617360812023]
Graph Neural Networks achieve state-of-the-art performance on a plethora of graph classification tasks.
HoscPool is a clustering-based graph pooling operator that captures higher-order information hierarchically.
We evaluate HoscPool on graph classification tasks and its clustering component on graphs with ground-truth community structure.
arXiv Detail & Related papers (2022-09-02T09:17:10Z)
- Tuning the Geometry of Graph Neural Networks [0.7614628596146599]
Spatial graph convolution operators have been heralded as key to the success of Graph Neural Networks (GNNs).
We show that this aggregation operator is in fact tunable and identify explicit regimes in which certain choices of operators -- and therefore, embedding geometries -- might be more appropriate.
arXiv Detail & Related papers (2022-07-12T23:28:03Z) - Nonbacktracking spectral clustering of nonuniform hypergraphs [2.408714894793063]
We study spectral clustering for nonuniform hypergraphs based on the hypergraph nonbacktracking operator.
We propose an alternating algorithm for inference in a hypergraph blockmodel via linearized belief-propagation.
arXiv Detail & Related papers (2022-04-27T01:14:06Z) - Koopman-based spectral clustering of directed and time-evolving graphs [0.3655021726150368]
Spectral clustering algorithms for undirected graphs are well established and have been successfully applied to unsupervised machine learning problems.
However, clustering directed graphs remains notoriously difficult and there is no universally accepted definition of clusters in directed graphs.
We derive clustering algorithms for directed and time-evolving graphs using relationships between Laplacians and transfer operators.
The resulting clusters can be interpreted as coherent sets, which play an important role in the analysis of transport and mixing processes in fluid flows.
arXiv Detail & Related papers (2022-04-06T17:33:24Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of convolutional models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deep models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Graph Laplacians, Riemannian Manifolds and their Machine-Learning [2.258160413679475]
We apply some of the latest techniques in data science such as supervised and unsupervised machine-learning and topological data analysis to the Wolfram database of some 8000 finite graphs.
We find that neural classifiers, regressors, and networks can perform, with high efficiency and accuracy, a multitude of tasks ranging from recognizing graph Ricci-flatness to predicting the spectral gap to detecting the presence of Hamiltonian cycles.
arXiv Detail & Related papers (2020-06-30T09:16:56Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering that is consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.