Time Extrapolation with Graph Convolutional Autoencoder and Tensor Train Decomposition
- URL: http://arxiv.org/abs/2511.23037v1
- Date: Fri, 28 Nov 2025 09:59:17 GMT
- Title: Time Extrapolation with Graph Convolutional Autoencoder and Tensor Train Decomposition
- Authors: Yuanhong Chen, Federico Pichi, Zhen Gao, Gianluigi Rozza
- Abstract summary: We develop a time-consistent reduced-order model for parameterized partial differential equations on unstructured grids. In particular, high-fidelity snapshots are represented as a combination of parametric, spatial, and temporal cores via TT decomposition. We enhance the generalization performance by developing a multi-fidelity two-stage approach in the framework of Deep Operator Networks (DeepONet). Numerical results, including heat-conduction, advection-diffusion, and vortex-shedding phenomena, demonstrate strong performance in learning the dynamics in the extrapolation regime for complex geometries.
- Score: 9.446359051690292
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph autoencoders have gained attention in nonlinear reduced-order modeling of parameterized partial differential equations defined on unstructured grids. Although they provide a geometrically consistent way of treating complex domains, applying such architectures to parameterized dynamical systems for temporal prediction beyond the training data, i.e. the extrapolation regime, is still a challenging task due to the simultaneous need for temporal causality and generalizability in the parametric space. In this work, we explore the integration of graph convolutional autoencoders (GCAs) with tensor train (TT) decomposition and Operator Inference (OpInf) to develop a time-consistent reduced-order model. In particular, high-fidelity snapshots are represented as a combination of parametric, spatial, and temporal cores via TT decomposition, while OpInf is used to learn the evolution of the latter. Moreover, we enhance the generalization performance by developing a multi-fidelity two-stage approach in the framework of Deep Operator Networks (DeepONet), treating the spatial and temporal cores as the trunk networks, and the parametric core as the branch network. Numerical results, including heat-conduction, advection-diffusion, and vortex-shedding phenomena, demonstrate strong performance in learning the dynamics in the extrapolation regime for complex geometries, also in comparison with state-of-the-art approaches, e.g., MeshGraphNets.
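The pipeline the abstract describes — factor the snapshot tensor into parametric, spatial, and temporal TT cores, then learn the evolution of the temporal core — can be sketched with a standard TT-SVD and a least-squares fit. This is a minimal illustration, not the paper's implementation: the tensor shape, TT ranks, and the discrete-time linear OpInf surrogate are all illustrative assumptions.

```python
import numpy as np

def tt_svd(X, ranks):
    """TT-SVD of a 3-way snapshot tensor X (params x space x time).

    Returns a parametric core G1, a spatial core G2, and a temporal
    core G3 with internal TT ranks (r1, r2)."""
    n1, n2, n3 = X.shape
    r1, r2 = ranks
    # First unfolding: n1 x (n2*n3); truncated SVD gives the parametric core.
    U, S, Vt = np.linalg.svd(X.reshape(n1, n2 * n3), full_matrices=False)
    G1 = U[:, :r1]                                     # (n1, r1)
    # Carry the remainder forward and unfold again: (r1*n2) x n3.
    M = (np.diag(S[:r1]) @ Vt[:r1]).reshape(r1 * n2, n3)
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    G2 = U[:, :r2].reshape(r1, n2, r2)                 # spatial core
    G3 = np.diag(S[:r2]) @ Vt[:r2]                     # temporal core (r2, n3)
    return G1, G2, G3

def tt_reconstruct(G1, G2, G3):
    """Contract the three cores back into the full tensor."""
    return np.einsum('ia,ajb,bk->ijk', G1, G2, G3)

def fit_latent_dynamics(G3):
    """Hypothetical discrete-time OpInf surrogate: fit q_{k+1} ~ A q_k
    on the columns of the temporal core by least squares."""
    Q = G3
    A = np.linalg.lstsq(Q[:, :-1].T, Q[:, 1:].T, rcond=None)[0].T
    return A
```

Once `A` is fitted, extrapolation beyond the training horizon amounts to iterating `q = A @ q` from the last temporal-core column and contracting with the (fixed) parametric and spatial cores — which is what makes the temporal core the natural place to impose time consistency.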
Related papers
- ACFormer: Mitigating Non-linearity with Auto Convolutional Encoder for Time Series Forecasting [6.27761817493579]
Time series forecasting (TSF) faces challenges in modeling complex intra-channel temporal dependencies and inter-channel correlations. We propose ACFormer, an architecture designed to reconcile the efficiency of linear projections with the non-linear feature-extraction power of convolutions.
arXiv Detail & Related papers (2026-01-28T13:47:54Z) - Latent Dynamics Graph Convolutional Networks for model order reduction of parameterized time-dependent PDEs [0.0]
We introduce Latent Dynamics Graph Convolutional Network (LD-GCN), a purely data-driven, encoder-free architecture. LD-GCN learns a global, low-dimensional representation of dynamical systems conditioned on external inputs and parameters. Our framework enhances interpretability by enabling the analysis of the reduced dynamics and supporting zero-shot prediction.
arXiv Detail & Related papers (2026-01-16T13:10:00Z) - An Operator-Consistent Graph Neural Network for Learning Diffusion Dynamics on Irregular Meshes [0.0]
Multiphysics interactions such as diffusion, damage, and healing often take place on irregular meshes. We develop an operator-consistent graph neural network (OCGNN-PINN) that approximates PDE evolution under physics-informed constraints.
arXiv Detail & Related papers (2025-12-05T06:58:25Z) - Transformer with Koopman-Enhanced Graph Convolutional Network for Spatiotemporal Dynamics Forecasting [12.301897782320967]
TK-GCN is a two-stage framework that integrates geometry-aware spatial encoding with long-range temporal modeling. We show that TK-GCN consistently delivers superior predictive accuracy across a range of forecast horizons.
arXiv Detail & Related papers (2025-07-05T01:26:03Z) - STRGCN: Capturing Asynchronous Spatio-Temporal Dependencies for Irregular Multivariate Time Series Forecasting [14.156419219696252]
STRGCN captures the complex interdependencies in IMTS by representing them as a fully connected graph. Experiments on four public datasets demonstrate that STRGCN achieves state-of-the-art accuracy, competitive memory usage, and training speed.
arXiv Detail & Related papers (2025-05-07T06:41:33Z) - T-Graphormer: Using Transformers for Spatiotemporal Forecasting [2.855856661274715]
T-Graphormer reduces root mean squared error (RMSE) and mean absolute percentage error (MAPE) by up to 20% and 10%. We show the effectiveness of T-Graphormer on real-world traffic prediction benchmark datasets.
arXiv Detail & Related papers (2025-01-22T23:32:29Z) - Detecting Anomalies in Dynamic Graphs via Memory enhanced Normality [39.476378833827184]
Anomaly detection in dynamic graphs presents a significant challenge due to the temporal evolution of graph structures and attributes.
We introduce a novel spatial-temporal memory-enhanced graph autoencoder (STRIPE).
STRIPE significantly outperforms existing methods, with a 5.8% improvement in AUC scores and 4.62x faster training time.
arXiv Detail & Related papers (2024-03-14T02:26:10Z) - Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - Scalable Spatiotemporal Graph Neural Networks [14.415967477487692]
Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatiotemporal GNNs, the computational complexity scales quadratically with the length of the sequence times the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
arXiv Detail & Related papers (2022-09-14T09:47:38Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE)
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z) - Continuous-in-Depth Neural Networks [107.47887213490134]
We first show that ResNets fail to be meaningful dynamical systems in this richer sense.
We then demonstrate that neural network models can learn to represent continuous dynamical systems.
We introduce ContinuousNet as a continuous-in-depth generalization of ResNet architectures.
arXiv Detail & Related papers (2020-08-05T22:54:09Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.