Learn to Jump: Adaptive Random Walks for Long-Range Propagation through Graph Hierarchies
- URL: http://arxiv.org/abs/2509.01381v1
- Date: Mon, 01 Sep 2025 11:27:53 GMT
- Title: Learn to Jump: Adaptive Random Walks for Long-Range Propagation through Graph Hierarchies
- Authors: Joël Mathys, Federico Errica
- Abstract summary: We propose a novel approach exploiting hierarchical graph structures and adaptive random walks to address this challenge. Our method introduces learnable transition probabilities that decide whether the walk should prefer the original graph or travel across hierarchical shortcuts.
- Score: 11.079279032331483
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Message-passing architectures struggle to sufficiently model long-range dependencies in node and graph prediction tasks. We propose a novel approach exploiting hierarchical graph structures and adaptive random walks to address this challenge. Our method introduces learnable transition probabilities that decide whether the walk should prefer the original graph or travel across hierarchical shortcuts. On a synthetic long-range task, we demonstrate that our approach can exceed the theoretical bound that constrains traditional approaches operating solely on the original topology. Specifically, walks that prefer the hierarchy achieve the same performance as longer walks on the original graph. These preliminary findings open a promising direction for efficiently processing large graphs while effectively capturing long-range dependencies.
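The core mechanism described in the abstract — a walk that at each step decides between an edge of the original graph and a hierarchical shortcut — can be illustrated with a minimal sketch. Note that this is an assumption-laden toy: the paper learns the transition probabilities end-to-end and builds the hierarchy from the graph, whereas here `p_jump` is a fixed scalar and the hierarchy is supplied by hand; the function names are invented for illustration.

```python
import random

def adaptive_walk_step(node, graph_neighbors, hierarchy_neighbors, p_jump, rng):
    """One step of an adaptive random walk.

    With probability p_jump (in the paper this would be a learned,
    possibly state-dependent transition probability; here it is a
    fixed scalar) the walk follows a hierarchical shortcut; otherwise
    it follows an edge of the original graph.
    """
    has_shortcut = bool(hierarchy_neighbors.get(node))
    use_hierarchy = has_shortcut and rng.random() < p_jump
    candidates = hierarchy_neighbors[node] if use_hierarchy else graph_neighbors[node]
    return rng.choice(candidates)

def adaptive_walk(start, graph_neighbors, hierarchy_neighbors, p_jump, length, seed=0):
    """Run a walk of `length` steps from `start`; returns the visited nodes."""
    rng = random.Random(seed)
    walk = [start]
    for _ in range(length):
        walk.append(adaptive_walk_step(
            walk[-1], graph_neighbors, hierarchy_neighbors, p_jump, rng))
    return walk

# Toy example: a path graph 0-1-2-3-4-5 plus a coarse "super-node" 's'
# attached to both ends. A walk that prefers the hierarchy can cross
# the graph in two hops instead of five.
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
hierarchy = {0: ['s'], 5: ['s'], 's': [0, 5]}
```

With `p_jump = 1.0` the walk from node 0 immediately jumps to `'s'` and can reach node 5 in two steps, which is the intuition behind walks on the hierarchy matching longer walks on the original topology.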
Related papers
- HELP: HyperNode Expansion and Logical Path-Guided Evidence Localization for Accurate and Efficient GraphRAG [53.30561659838455]
Large Language Models (LLMs) often struggle with inherent knowledge boundaries and hallucinations. Retrieval-Augmented Generation (RAG) frequently overlooks structural interdependencies essential for multi-hop reasoning. HELP achieves competitive performance across multiple simple and multi-hop QA benchmarks and up to a 28.8× speedup over leading graph-based RAG baselines.
arXiv Detail & Related papers (2026-02-24T14:05:29Z) - Computationally-efficient Graph Modeling with Refined Graph Random Features [13.71262071974362]
We propose a new class of Graph Random Features (GRFs) for efficient and accurate computations. GRFs++ resolve some of the long-standing limitations of regular GRFs. They reduce dependence on sampling long graph random walks via a novel walk-stitching technique.
arXiv Detail & Related papers (2025-10-09T02:53:26Z) - Skeleton-Guided Learning for Shortest Path Search [31.51778160288742]
Shortest path search is a core operation in graph-based applications. We propose a versatile learning-based framework for shortest path search on generic graphs.
arXiv Detail & Related papers (2025-08-04T10:21:52Z) - On Measuring Long-Range Interactions in Graph Neural Networks [24.974333602585368]
Long-range graph tasks are an open problem in graph neural network research. We introduce a range measure for operators on graphs, and validate it with synthetic experiments.
arXiv Detail & Related papers (2025-06-06T10:48:30Z) - Pave Your Own Path: Graph Gradual Domain Adaptation on Fused Gromov-Wasserstein Geodesics [59.07903030446756]
Graph neural networks are highly vulnerable to distribution shifts on graphs. We present Gadget, the first framework for non-IID graph data. Gadget can be seamlessly integrated with existing graph DA methods to handle large shifts on graphs.
arXiv Detail & Related papers (2025-05-19T05:03:58Z) - Extendable Long-Horizon Planning via Hierarchical Multiscale Diffusion [62.91968752955649]
This paper tackles a novel problem, extendable long-horizon planning: enabling agents to plan trajectories longer than those in training data without compounding errors. We propose an augmentation method that iteratively generates longer trajectories by stitching shorter ones. HM-Diffuser trains on these extended trajectories using a hierarchical structure, efficiently handling tasks across multiple temporal scales.
arXiv Detail & Related papers (2025-03-25T22:52:46Z) - Learning Long Range Dependencies on Graphs via Random Walks [6.7864586321550595]
Message-passing graph neural networks (GNNs) excel at capturing local relationships but struggle with long-range dependencies in graphs.
Graph transformers (GTs) enable global information exchange but often oversimplify the graph structure by representing graphs as sets of fixed-length vectors.
This work introduces a novel architecture that overcomes the shortcomings of both approaches by combining the long-range information of random walks with local message passing.
arXiv Detail & Related papers (2024-06-05T15:36:57Z) - Sparse Training of Discrete Diffusion Models for Graph Generation [45.103518022696996]
We introduce SparseDiff, a novel diffusion model based on the observation that almost all large graphs are sparse.
By selecting a subset of edges, SparseDiff effectively leverages sparse graph representations both during the noising process and within the denoising network.
Our model demonstrates state-of-the-art performance across multiple metrics on both small and large datasets.
arXiv Detail & Related papers (2023-11-03T16:50:26Z) - You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z) - Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Persistence Homology for Link Prediction: An Interactive View [15.068319518015421]
Link prediction is an important learning task for graph-structured data.
We propose a novel topological approach to characterize interactions between two nodes.
We also propose a graph neural network method that outperforms state-of-the-art methods on different benchmarks.
arXiv Detail & Related papers (2021-02-20T04:33:59Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Link prediction in the original graph can therefore be solved equivalently as node classification in its corresponding line graph, rather than as a graph classification task.
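The line-graph construction behind this reformulation is standard graph theory and can be sketched in a few lines (the function name and adjacency-dict representation here are illustrative choices, not the paper's implementation): each edge of G becomes a node of L(G), and two such nodes are adjacent exactly when the underlying edges share an endpoint.

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph L(G) of an undirected graph G.

    Each node of L(G) is an edge of G (stored as a sorted tuple);
    two nodes of L(G) are adjacent iff the corresponding edges of G
    share an endpoint.
    """
    nodes = [tuple(sorted(e)) for e in edges]
    adj = {n: set() for n in nodes}
    for a, b in combinations(nodes, 2):
        if set(a) & set(b):  # the two edges of G share an endpoint
            adj[a].add(b)
            adj[b].add(a)
    return adj
```

For example, the path graph 0-1-2-3 has line graph (0,1)-(1,2)-(2,3), itself a path; predicting the presence of edge (1,2) in G becomes classifying the node (1,2) in L(G).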
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.