A continuum limit for the PageRank algorithm
- URL: http://arxiv.org/abs/2001.08973v3
- Date: Sun, 10 Jan 2021 15:05:01 GMT
- Title: A continuum limit for the PageRank algorithm
- Authors: Amber Yuan, Jeff Calder, Braxton Osting
- Abstract summary: Semi-supervised and unsupervised machine learning methods often rely on graphs to model data.
We propose a new framework for rigorously studying continuum limits of learning algorithms on directed graphs.
We use the framework to study the PageRank algorithm and show how it can be interpreted as a numerical scheme on a directed graph involving a type of normalized graph Laplacian.
- Score: 1.2891210250935146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Semi-supervised and unsupervised machine learning methods often rely on
graphs to model data, prompting research on how theoretical properties of
operators on graphs are leveraged in learning problems. While most of the
existing literature focuses on undirected graphs, directed graphs are very
important in practice, giving models for physical, biological, or
transportation networks, among many other applications. In this paper, we
propose a new framework for rigorously studying continuum limits of learning
algorithms on directed graphs. We use the new framework to study the PageRank
algorithm, and show how it can be interpreted as a numerical scheme on a
directed graph involving a type of normalized graph Laplacian. We show that the
corresponding continuum limit problem, which is taken as the number of webpages
grows to infinity, is a second-order, possibly degenerate, elliptic equation
that contains reaction, diffusion, and advection terms. We prove that the
numerical scheme is consistent and stable and compute explicit rates of
convergence of the discrete solution to the solution of the continuum limit
PDE. We give applications to proving stability and asymptotic regularity of the
PageRank vector. Finally, we illustrate our results with numerical experiments
and explore an application to data depth.
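As context for the numerical-scheme interpretation described in the abstract, the classical PageRank vector is the fixed point of a damped random walk on the directed graph, computable by power iteration. A minimal sketch follows; the function name, parameter defaults, and dangling-node convention are illustrative choices, not taken from the paper:

```python
import numpy as np

def pagerank(adj, alpha=0.85, tol=1e-10, max_iter=1000):
    """Power iteration for PageRank on a directed graph.

    adj[i, j] = 1 if there is an edge i -> j. Dangling nodes
    (pages with no out-links) are treated as linking to every page.
    """
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Row-stochastic transition matrix; dangling rows become uniform.
    P = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg, 1)[:, None],
                 1.0 / n)
    v = np.full(n, 1.0 / n)  # start from the uniform distribution
    for _ in range(max_iter):
        # Fixed-point update: v = alpha * P^T v + (1 - alpha) * uniform
        v_next = alpha * (v @ P) + (1 - alpha) / n
        if np.abs(v_next - v).sum() < tol:
            return v_next
        v = v_next
    return v
```

The update preserves the total mass (the vector always sums to 1), and the paper's continuum-limit analysis concerns the behavior of this fixed point as the number of pages grows to infinity.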
Related papers
- Continuous Product Graph Neural Networks [5.703629317205571]
Multidomain data defined on multiple graphs holds significant potential in practical applications in computer science.
We introduce Continuous Product Graph Neural Networks (CITRUS) that emerge as a natural solution to the TPDEG.
We evaluate CITRUS on well-known traffic and weather forecasting datasets, demonstrating superior performance over existing approaches.
arXiv Detail & Related papers (2024-05-29T08:36:09Z)
- Overcoming Order in Autoregressive Graph Generation [12.351817671944515]
Graph generation is a fundamental problem in various domains, including chemistry and social networks.
Recent work has shown that molecular graph generation using recurrent neural networks (RNNs) is advantageous compared to traditional generative approaches.
arXiv Detail & Related papers (2024-02-04T09:58:22Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN)
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Graphon based Clustering and Testing of Networks: Algorithms and Theory [11.3700474413248]
Network-valued data are encountered in a wide range of applications and pose challenges in learning.
We present two clustering algorithms that achieve state-of-the-art results.
We further study the applicability of the proposed distance for graph two-sample testing problems.
arXiv Detail & Related papers (2021-10-06T13:14:44Z)
- Learning Graphs from Smooth Signals under Moment Uncertainty [23.868075779606425]
We consider the problem of inferring the graph structure from a given set of graph signals.
Traditional graph learning models do not take this distributional uncertainty into account.
arXiv Detail & Related papers (2021-05-12T06:47:34Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to improved clustering performance with respect to the aggregated graph.
We also learn a clustering algorithm for solving clustering problems on the learned representation.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Graph Ordering: Towards the Optimal by Learning [69.72656588714155]
Graph representation learning has achieved remarkable success in many graph-based applications, such as node classification, link prediction, and community detection.
However, some kinds of graph applications, such as graph compression and edge partition, are very hard to reduce to graph representation learning tasks.
In this paper, we propose to attack the graph ordering problem behind such applications by a novel learning approach.
arXiv Detail & Related papers (2020-01-18T09:14:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.