Learning graphons from data: Random walks, transfer operators, and spectral clustering
- URL: http://arxiv.org/abs/2507.18147v1
- Date: Thu, 24 Jul 2025 07:29:03 GMT
- Title: Learning graphons from data: Random walks, transfer operators, and spectral clustering
- Authors: Stefan Klus, Jason J. Bramburger
- Abstract summary: We make an explicit link between the underlying stochastic process of a signal and a random walk process on a graphon. We show that it is also possible to reconstruct transition probability densities and, if the random walk process is reversible, the graphon itself. The resulting data-driven methods are applied to a variety of synthetic and real-world signals, including daily average temperatures and stock index values.
- Score: 0.9714447724811842
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many signals evolve in time as a stochastic process, randomly switching between states over discretely sampled time points. Here we make an explicit link between the underlying stochastic process of a signal that can take on a bounded continuum of values and a random walk process on a graphon. Graphons are infinite-dimensional objects that represent the limit of convergent sequences of graphs whose size tends to infinity. We introduce transfer operators, such as the Koopman and Perron--Frobenius operators, associated with random walk processes on graphons and then illustrate how these operators can be estimated from signal data and how their eigenvalues and eigenfunctions can be used for detecting clusters, thereby extending conventional spectral clustering methods from graphs to graphons. Furthermore, we show that it is also possible to reconstruct transition probability densities and, if the random walk process is reversible, the graphon itself using only the signal. The resulting data-driven methods are applied to a variety of synthetic and real-world signals, including daily average temperatures and stock index values.
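The pipeline the abstract describes, estimating a transfer operator from a discretely sampled signal and clustering states via its subdominant eigenfunctions, can be sketched with a standard Ulam-type discretization. This is an illustrative stand-in, not the authors' implementation: the bin count, the toy two-region signal, and all function names below are assumptions.

```python
import numpy as np

def estimate_transition_matrix(signal, n_bins):
    """Ulam-type estimate of a transition matrix for a signal taking
    values in a bounded interval: partition the range into bins and
    count observed one-step transitions between bins."""
    lo, hi = signal.min(), signal.max()
    bins = np.minimum(((signal - lo) / (hi - lo) * n_bins).astype(int), n_bins - 1)
    counts = np.zeros((n_bins, n_bins))
    for i, j in zip(bins[:-1], bins[1:]):
        counts[i, j] += 1.0
    # normalize rows; empty rows (unvisited bins) are left as zeros
    return counts / np.maximum(counts.sum(axis=1, keepdims=True), 1.0)

# toy signal: noise around one of two metastable regions with rare switches
rng = np.random.default_rng(42)
state, xs = 0, []
for _ in range(20000):
    if rng.random() < 0.01:            # rare jump between the two regions
        state = 1 - state
    xs.append((0.25, 0.75)[state] + 0.08 * rng.standard_normal())
x = np.clip(np.array(xs), 0.0, 1.0)

P = estimate_transition_matrix(x, n_bins=20)
evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)        # leading eigenvalue is close to 1
# the sign structure of the subdominant eigenvector partitions the bins
clusters = (evecs[:, order[1]].real > 0).astype(int)
```

For a reversible process the paper additionally reconstructs the graphon itself from such estimates; the sketch above stops at the spectral clustering step.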
Related papers
- Graph-Dictionary Signal Model for Sparse Representations of Multivariate Data [49.77103348208835]
We define a novel Graph-Dictionary signal model, where a finite set of graphs characterizes relationships in data distribution through a weighted sum of their Laplacians.
We propose a framework to infer the graph dictionary representation from observed data, along with a bilinear generalization of the primal-dual splitting algorithm to solve the learning problem.
We exploit graph-dictionary representations in a motor imagery decoding task on brain activity data, where we classify imagined motion better than standard methods.
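The model in this entry, a weighted sum of Laplacians, can be written down directly. A minimal sketch, with the function name and toy matrices as illustrative assumptions rather than the paper's code:

```python
import numpy as np

def combined_laplacian(adjacencies, weights):
    """Graph-dictionary combination L(w) = sum_k w_k * L_k,
    where L_k = D_k - A_k is the Laplacian of the k-th dictionary graph."""
    L = np.zeros_like(adjacencies[0], dtype=float)
    for A, w in zip(adjacencies, weights):
        D = np.diag(A.sum(axis=1))     # degree matrix of this graph
        L += w * (D - A)
    return L
```

Here the weights are given; inferring them from observed signals is the learning problem the paper addresses with its bilinear generalization of primal-dual splitting.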
arXiv Detail & Related papers (2024-11-08T17:40:43Z)
- Generating Graphs via Spectral Diffusion [48.70458395826864]
We present GGSD, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
An extensive set of experiments on both synthetic and real-world graphs demonstrates the strengths of our model against state-of-the-art alternatives.
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- Sampling and Uniqueness Sets in Graphon Signal Processing [125.12453307797]
We study the properties of sampling sets on families of large graphs by leveraging the theory of graphons and graph limits.
We exploit the convergence results to provide an algorithm that obtains approximately optimal sampling sets.
arXiv Detail & Related papers (2024-01-11T22:31:48Z)
- Sparse Training of Discrete Diffusion Models for Graph Generation [45.103518022696996]
We introduce SparseDiff, a novel diffusion model based on the observation that almost all large graphs are sparse.
By selecting a subset of edges, SparseDiff effectively leverages sparse graph representations both during the noising process and within the denoising network.
Our model demonstrates state-of-the-art performance across multiple metrics on both small and large datasets.
arXiv Detail & Related papers (2023-11-03T16:50:26Z)
- Graph Laplacian Learning with Exponential Family Noise [8.594140167290098]
We propose a versatile graph inference framework for learning from graph signals corrupted by exponential family noise.
Our framework generalizes previous methods from continuous smooth graph signals to various data types.
arXiv Detail & Related papers (2023-06-14T02:09:52Z)
- Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z)
- Time-varying Graph Representation Learning via Higher-Order Skip-Gram with Negative Sampling [0.456877715768796]
We build upon the fact that the skip-gram embedding approach implicitly performs a matrix factorization.
We show that higher-order skip-gram with negative sampling is able to disentangle the role of nodes and time.
We empirically evaluate our approach using time-resolved face-to-face proximity data, showing that the learned time-varying graph representations outperform state-of-the-art methods.
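The fact that skip-gram with negative sampling implicitly performs a matrix factorization has a compact classical form: SGNS approximately factorizes the pointwise mutual information matrix shifted by log k (Levy & Goldberg). The sketch below shows only that second-order case; the paper's higher-order variant factorizes a (node, node, time) tensor instead. The function name and toy co-occurrence counts are illustrative assumptions.

```python
import numpy as np

def sgns_factorization(cooc, k=1, dim=2):
    """Factorize the shifted positive PMI matrix built from a
    co-occurrence count matrix, the implicit objective of skip-gram
    with k negative samples."""
    total = cooc.sum()
    row = cooc.sum(axis=1, keepdims=True)
    col = cooc.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(cooc * total / (row * col))
    m = np.maximum(pmi - np.log(k), 0)        # shifted positive PMI
    u, s, _ = np.linalg.svd(m)
    return u[:, :dim] * np.sqrt(s[:dim])      # low-rank node embeddings
```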
arXiv Detail & Related papers (2020-06-25T12:04:48Z)
- Offline detection of change-points in the mean for stationary graph signals [55.98760097296213]
We propose an offline method that relies on the concept of graph signal stationarity.
Our detector comes with a proof of a non-asymptotic oracle inequality.
arXiv Detail & Related papers (2020-06-18T15:51:38Z)
- Breaking the waves: asymmetric random periodic features for low-bitrate kernel machines [13.704881067616995]
We introduce the general framework of asymmetric random periodic features, where the two signals of interest are observed through random periodic features.
We prove uniform probabilistic error bounds holding for all signal pairs from an infinite low-complexity set.
arXiv Detail & Related papers (2020-04-14T14:44:54Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.