Second-Order Tensorial Partial Differential Equations on Graphs
- URL: http://arxiv.org/abs/2509.02015v3
- Date: Tue, 16 Sep 2025 14:41:12 GMT
- Title: Second-Order Tensorial Partial Differential Equations on Graphs
- Authors: Aref Einizade, Fragkiskos D. Malliaros, Jhony H. Giraldo
- Abstract summary: We introduce second-order tensorial partial differential equations on graphs (SoTPDEG) and propose the first theoretically grounded framework for second-order continuous product graph neural networks (GNNs).
- Score: 13.421159402806675
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Processing data on multiple interacting graphs is crucial for many applications, but existing approaches rely mostly on discrete filtering or first-order continuous models, which dampen high frequencies and slow information propagation. In this paper, we introduce second-order tensorial partial differential equations on graphs (SoTPDEG) and propose the first theoretically grounded framework for second-order continuous product graph neural networks (GNNs). Our method exploits the separability of cosine kernels on Cartesian product graphs to enable efficient spectral decomposition while preserving high-frequency components. We further provide rigorous over-smoothing and stability analyses under graph perturbations, establishing a solid theoretical foundation. Experimental results on spatiotemporal traffic forecasting demonstrate superiority over the compared methods.
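The separability mentioned in the abstract can be illustrated with a small numerical sketch (not the authors' code; the graph sizes, the random signal, and the wave-equation solution x(t) = cos(t√L) x(0) are illustrative assumptions). On a Cartesian product of two factor graphs, the product Laplacian is the Kronecker sum L1 ⊕ L2, so its eigenpairs factor into those of the small factors and the cosine kernel can be applied without ever eigendecomposing the full product:

```python
import numpy as np

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

# Two small factor graphs (e.g., a spatial graph and a temporal graph).
A1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path, 3 nodes
A2 = np.array([[0, 1], [1, 0]], dtype=float)                   # path, 2 nodes
L1, L2 = laplacian(A1), laplacian(A2)

# Eigendecompose only the small factors.
w1, V1 = np.linalg.eigh(L1)
w2, V2 = np.linalg.eigh(L2)
n1, n2 = len(w1), len(w2)

# Cartesian product Laplacian: Kronecker sum L1 (x) I + I (x) L2.
Lc = np.kron(L1, np.eye(n2)) + np.kron(np.eye(n1), L2)

t = 0.7
X0 = np.random.default_rng(0).standard_normal(n1 * n2)

# Direct evaluation of the cosine kernel cos(t * sqrt(Lc)) @ X0,
# which costs a full O((n1*n2)^3) eigendecomposition.
wc, Vc = np.linalg.eigh(Lc)
y_direct = Vc @ (np.cos(t * np.sqrt(np.clip(wc, 0, None))) * (Vc.T @ X0))

# Separable evaluation: product eigenvalues are sums of factor eigenvalues,
# and product eigenvectors are Kronecker products of factor eigenvectors,
# so the filter reduces to two small matrix products per side.
lam = w1[:, None] + w2[None, :]        # (n1, n2) grid of product eigenvalues
X0m = X0.reshape(n1, n2)               # signal as a factor-indexed matrix
coef = V1.T @ X0m @ V2                 # spectral coefficients
y_sep = (V1 @ (np.cos(t * np.sqrt(lam)) * coef) @ V2.T).ravel()

assert np.allclose(y_direct, y_sep)
```

The two evaluations agree, and only the separable one scales to large product graphs, since it touches each factor's spectrum independently.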
Related papers
- Graph-Aware Diffusion for Signal Generation [35.631095096228]
We study the problem of generating graph signals from unknown distributions defined over given graphs.
Our approach builds on generative diffusion models, which are well established in vision and graph generation.
We demonstrate the advantages of GAD on synthetic data, real traffic speed measurements, and a temperature sensor network.
arXiv Detail & Related papers (2025-10-06T17:11:32Z) - On the Convergence and Size Transferability of Continuous-depth Graph Neural Networks [3.3390650922528184]
Continuous-depth graph neural networks (GNNs) combine the structural inductive bias of GNNs with the continuous-depth architecture of neural ordinary differential equations (ODEs).
We present a rigorous convergence analysis of graph neural ODEs with time-varying parameters in the infinite-node limit, providing theoretical insights into their size transferability.
arXiv Detail & Related papers (2025-10-04T19:59:21Z) - Stochastic Variance-Reduced Iterative Hard Thresholding in Graph Sparsity Optimization [0.626226809683956]
We introduce two methods to solve gradient-based graph sparsity optimization: GraphRG-IHT and GraphSG-IHT.
We provide a general framework for theoretical analysis, demonstrating that both methods enjoy guarantees within a gradient-based framework.
arXiv Detail & Related papers (2024-07-24T03:26:26Z) - Continuous Product Graph Neural Networks [5.703629317205571]
Multidomain data defined on multiple graphs holds significant potential in practical applications in computer science.
We introduce Continuous Product Graph Neural Networks (CITRUS) that emerge as a natural solution to the TPDEG.
We evaluate CITRUS on well-known traffic and weather spatiotemporal forecasting datasets, demonstrating superior performance over existing approaches.
arXiv Detail & Related papers (2024-05-29T08:36:09Z) - Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z) - Fine-tuning Graph Neural Networks by Preserving Graph Generative Patterns [13.378277755978258]
We show that the structural divergence between pre-training and downstream graphs significantly limits the transferability when using the vanilla fine-tuning strategy.
We propose G-Tuning to preserve the generative patterns of downstream graphs.
G-Tuning demonstrates average improvements of 0.5% and 2.6% on in-domain and out-of-domain transfer learning experiments, respectively.
arXiv Detail & Related papers (2023-12-21T05:17:10Z) - Supercharging Graph Transformers with Advective Diffusion [28.40109111316014]
This paper proposes Advective Diffusion Transformer (AdvDIFFormer), a physics-inspired graph Transformer model designed to address this challenge.
We show that AdvDIFFormer has provable capability for controlling generalization error with topological shifts.
Empirically, the model demonstrates superiority in various predictive tasks across information networks, molecular screening and protein interactions.
arXiv Detail & Related papers (2023-10-10T08:40:47Z) - SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this approach, we adopt distinct, well-suited attribute and topology fusion mechanisms in the two views, which are conducive to mining relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Score-based Generative Modeling of Graphs via the System of Stochastic
Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method yields well-separated clusters with respect to ground-truth labels.
We also propose a clustering algorithm for solving the resulting graph clustering problem.
arXiv Detail & Related papers (2020-10-29T09:58:02Z) - A continuum limit for the PageRank algorithm [1.2891210250935146]
Semi-supervised and unsupervised machine learning methods often rely on graphs to model data.
We propose a new framework for rigorously studying continuum limits of learning algorithms on directed graphs.
We show how it can be interpreted as a numerical scheme on a directed graph involving a type of normalized graph Laplacian.
arXiv Detail & Related papers (2020-01-24T12:56:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.