Riemannian Liquid Spatio-Temporal Graph Network
- URL: http://arxiv.org/abs/2601.14115v1
- Date: Tue, 20 Jan 2026 16:09:05 GMT
- Title: Riemannian Liquid Spatio-Temporal Graph Network
- Authors: Liangsi Lu, Jingchao Wang, Zhaorong Dai, Hanqian Liu, Yang Shi,
- Abstract summary: Liquid Time-Constant networks (LTCs) excel at modeling irregularly-sampled dynamics but are fundamentally confined to Euclidean space. This limitation introduces significant geometric distortion when representing real-world graphs with inherent non-Euclidean structures. We introduce a framework that unifies continuous-time liquid dynamics with Riemannian geometric inductive biases. RLSTG achieves superior performance on graphs with complex structures.
- Score: 6.583503277841693
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Liquid Time-Constant networks (LTCs), a type of continuous-time graph neural network, excel at modeling irregularly-sampled dynamics but are fundamentally confined to Euclidean space. This limitation introduces significant geometric distortion when representing real-world graphs with inherent non-Euclidean structures (e.g., hierarchies and cycles), degrading representation quality. To overcome this limitation, we introduce the Riemannian Liquid Spatio-Temporal Graph Network (RLSTG), a framework that unifies continuous-time liquid dynamics with the geometric inductive biases of Riemannian manifolds. RLSTG models graph evolution through an Ordinary Differential Equation (ODE) formulated directly on a curved manifold, enabling it to faithfully capture the intrinsic geometry of both structurally static and dynamic spatio-temporal graphs. Moreover, we provide rigorous theoretical guarantees for RLSTG, extending stability theorems of LTCs to the Riemannian domain and quantifying its expressive power via state trajectory analysis. Extensive experiments on real-world benchmarks demonstrate that, by combining advanced temporal dynamics with a Riemannian spatial representation, RLSTG achieves superior performance on graphs with complex structures. Project Page: https://rlstg.github.io
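The core idea, integrating liquid-style continuous-time dynamics directly on a curved manifold, can be sketched with a toy Euler step on the Lorentz hyperboloid. This is a minimal illustration under assumed conventions, not the authors' RLSTG implementation; `liquid_step`, `tau`, and the input drive are hypothetical placeholders.

```python
import numpy as np

def minkowski_dot(u, v):
    # Lorentzian inner product: -u0*v0 + <u_rest, v_rest>
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def proj_tangent(x, u):
    # Project an ambient vector u onto the tangent space at x
    # (tangent vectors v satisfy <x, v>_L = 0)
    return u + minkowski_dot(x, u) * x

def exp_map(x, v):
    # Exponential map on the hyperboloid {x : <x,x>_L = -1, x0 > 0}
    n = np.sqrt(max(minkowski_dot(v, v), 1e-15))
    return np.cosh(n) * x + np.sinh(n) * (v / n)

def liquid_step(x, drive, tau=1.0, dt=0.1):
    # One explicit Euler step of a liquid-style ODE: scale the projected
    # input drive by dt/tau, then retract onto the manifold via exp_map.
    v = (dt / tau) * proj_tangent(x, drive)
    return exp_map(x, v)

x = np.array([1.0, 0.0, 0.0])  # the hyperboloid "origin"
for _ in range(10):
    x = liquid_step(x, np.array([0.0, 0.3, -0.1]))  # toy input signal
print(minkowski_dot(x, x))  # stays ~ -1: the state never leaves the manifold
```

Because each step projects to the tangent space and retracts with the exact exponential map, the state satisfies the manifold constraint at every step, which is the property that lets such an ODE avoid the geometric distortion of a Euclidean embedding.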
Related papers
- Latent Dynamics Graph Convolutional Networks for model order reduction of parameterized time-dependent PDEs [0.0]
We introduce Latent Dynamics Graph Convolutional Network (LD-GCN), a purely data-driven, encoder-free architecture. LD-GCN learns a global, low-dimensional representation of dynamical systems conditioned on external inputs and parameters. Our framework enhances interpretability by enabling the analysis of the reduced dynamics and supporting zero-shot prediction.
arXiv Detail & Related papers (2026-01-16T13:10:00Z) - Geometry-Aware Spiking Graph Neural Network [24.920334588995072]
We propose a Geometry-Aware Spiking Graph Neural Network that unifies spike-based neural dynamics with adaptive representation learning. Experiments on multiple benchmarks show that GSG achieves superior accuracy, robustness, and energy efficiency compared to both Euclidean SNNs and manifold-based GNNs.
arXiv Detail & Related papers (2025-08-09T02:52:38Z) - Learning Latent Graph Geometry via Fixed-Point Schrödinger-Type Activation: A Theoretical Study [1.1745324895296467]
We develop a unified theoretical framework for neural architectures with internal representations evolving as stationary states of dissipative Schrödinger-type dynamics on learned latent graphs. We prove existence, uniqueness, and smooth dependence of equilibria, and show that the dynamics are equivalent under the Bloch map to norm-preserving Landau–Lifshitz flows. The resulting model class provides a compact, geometrically interpretable, and analytically tractable foundation for learning latent graph geometry via fixed-point Schrödinger-type activations.
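The fixed-point view of activations described above can be illustrated generically: an equilibrium state is computed by iterating a contractive nonlinear map until it stops changing. The sketch below is a toy analogue, not the paper's Schrödinger-type dynamics; `W` and `b` are arbitrary small-norm placeholders.

```python
import numpy as np

# W has maximum absolute row sum < 1, so x -> tanh(W @ x + b) is a
# contraction in the infinity norm (tanh is 1-Lipschitz), and the
# iteration converges to a unique equilibrium activation.
W = np.array([[ 0.2, -0.1,  0.0,  0.1],
              [ 0.1,  0.3, -0.2,  0.0],
              [ 0.0,  0.1,  0.2, -0.1],
              [-0.1,  0.0,  0.1,  0.3]])
b = np.array([0.5, -0.3, 0.2, 0.1])

x = np.zeros(4)
for _ in range(200):
    x_new = np.tanh(W @ x + b)
    if np.max(np.abs(x_new - x)) < 1e-12:
        break  # reached a stationary state
    x = x_new

residual = np.max(np.abs(x - np.tanh(W @ x + b)))
print(residual)  # the equilibrium satisfies x = tanh(W x + b) to high precision
```

The Banach fixed-point theorem guarantees the existence and uniqueness result mirrored in this toy setting; the paper's contribution is proving the analogous properties for the far richer Schrödinger-type flows on latent graphs.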
arXiv Detail & Related papers (2025-07-27T00:35:15Z) - Learning Dynamic Graphs via Tensorized and Lightweight Graph Convolutional Networks [0.0]
A dynamic graph convolutional network (DGCN) has been successfully applied to perform precise representation learning on dynamic graphs. This study proposes a novel Tensorized and Lightweight Graph Convolutional Network (TLGCN) for accurate dynamic graph learning.
arXiv Detail & Related papers (2025-04-22T06:13:32Z) - Can we ease the Injectivity Bottleneck on Lorentzian Manifolds for Graph Neural Networks? [0.0]
Lorentzian Graph Isomorphic Network (LGIN) is a novel HGNN designed for enhanced discrimination within the Lorentzian model. LGIN is the first to adapt principles of powerful, highly discriminative GNN architectures to a Riemannian manifold.
arXiv Detail & Related papers (2025-03-31T18:49:34Z) - State Space Models on Temporal Graphs: A First-Principles Study [30.531930200222423]
Research on deep graph learning has shifted from static graphs to temporal graphs in response to real-world complex systems that exhibit dynamic behaviors.
Sequence models such as RNNs or Transformers have long been the predominant backbone networks for modeling such temporal graphs.
We develop GraphSSM, a graph state space model for modeling the dynamics of temporal graphs.
arXiv Detail & Related papers (2024-06-03T02:56:11Z) - DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (i.e., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z) - Towards Expressive Spectral-Temporal Graph Neural Networks for Time Series Forecasting [101.5022396668152]
Spectral-temporal graph neural network is a promising abstraction underlying most time series forecasting models. We establish a theoretical framework that unravels the expressive power of spectral-temporal GNNs. Our findings pave the way for devising a broader array of provably expressive GNN-based models for time series.
arXiv Detail & Related papers (2023-05-11T05:56:38Z) - A Self-supervised Riemannian GNN with Time Varying Curvature for Temporal Graph Learning [79.20249985327007]
We present a novel self-supervised Riemannian graph neural network (SelfRGNN).
Specifically, we design a curvature-varying GNN with a theoretically grounded time encoding, and formulate a functional curvature over time to model the evolvement shifting among the positive, zero and negative curvature spaces.
Extensive experiments show the superiority of SelfRGNN, and moreover, the case study shows the time-varying curvature of temporal graph in reality.
arXiv Detail & Related papers (2022-08-30T08:43:06Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
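The idea of a neural ODE that unifies spatial and temporal message passing can be sketched in its simplest linear form: a diffusion ODE dX/dt = (A_hat - I)X integrated with Euler steps. This is a minimal illustration, not the MTGODE architecture; the star-graph adjacency, step size, and step count are toy values.

```python
import numpy as np

# Star graph on 3 nodes; A_hat is the row-normalized adjacency, so
# dX/dt = (A_hat - I) X diffuses node features along edges over time.
A = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])
A_hat = A / A.sum(axis=1, keepdims=True)

X = np.array([[1.0], [0.0], [0.0]])  # feature mass starts on the hub node
dt = 0.1
for _ in range(500):
    X = X + dt * (A_hat @ X - X)  # one Euler step of continuous message passing
print(X.ravel())  # features equalize: every entry approaches 0.5
```

Each Euler step is one round of (lazy) message passing, so integrating the ODE for longer effectively deepens the spatial aggregation; a learned model would replace the fixed linear operator with a parameterized one and fill in missing graph topology, as the paper describes.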
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z) - Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.