Efficient Real-Time Adaptation of ROMs for Unsteady Flows Using Data Assimilation
- URL: http://arxiv.org/abs/2602.23188v1
- Date: Thu, 26 Feb 2026 16:43:28 GMT
- Title: Efficient Real-Time Adaptation of ROMs for Unsteady Flows Using Data Assimilation
- Authors: Ismaël Zighed, Andrea Nóvoa, Luca Magri, Taraneh Sayadi
- Abstract summary: We propose an efficient retraining strategy for a parameterized Reduced Order Model (ROM). The strategy attains accuracy comparable to full retraining while requiring only a fraction of the computational time. We show that, for the dynamical system considered, the dominant source of error in out-of-sample forecasts stems from distortions of the latent manifold.
- Score: 7.958594167693376
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We propose an efficient retraining strategy for a parameterized Reduced Order Model (ROM) that attains accuracy comparable to full retraining while requiring only a fraction of the computational time and relying solely on sparse observations of the full system. The architecture employs an encode-process-decode structure: a Variational Autoencoder (VAE) to perform dimensionality reduction, and a transformer network to evolve the latent states and model the dynamics. The ROM is parameterized by an external control variable, the Reynolds number in the Navier-Stokes setting, with the transformer exploiting attention mechanisms to capture both temporal dependencies and parameter effects. The probabilistic VAE enables stochastic sampling of trajectory ensembles, providing predictive means and uncertainty quantification through the first two moments. After initial training on a limited set of dynamical regimes, the model is adapted to out-of-sample parameter regions using only sparse data. Its probabilistic formulation naturally supports ensemble generation, which we employ within an ensemble Kalman filtering framework to assimilate data and reconstruct full-state trajectories from minimal observations. We further show that, for the dynamical system considered, the dominant source of error in out-of-sample forecasts stems from distortions of the latent manifold rather than changes in the latent dynamics. Consequently, retraining can be limited to the autoencoder, allowing for a lightweight, computationally efficient, real-time adaptation procedure with very sparse fine-tuning data.
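The assimilation component can be illustrated generically: below is a minimal stochastic ensemble Kalman filter (EnKF) analysis step in NumPy, applied to a toy latent state with two sparse observations. This is a sketch of the standard EnKF update, not the paper's implementation; the ensemble size, observation operator `H`, and noise levels are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, y_obs, H, obs_std):
    """Stochastic EnKF analysis: ensemble (n_members, n_state), y_obs (n_obs,)."""
    n = ensemble.shape[0]
    Hx = ensemble @ H.T                          # forecast observations
    X = ensemble - ensemble.mean(axis=0)         # state anomalies
    Y = Hx - Hx.mean(axis=0)                     # observation anomalies
    Pxy = X.T @ Y / (n - 1)                      # state-observation covariance
    Pyy = Y.T @ Y / (n - 1) + obs_std**2 * np.eye(len(y_obs))
    K = Pxy @ np.linalg.inv(Pyy)                 # Kalman gain
    perturbed = y_obs + obs_std * rng.standard_normal((n, len(y_obs)))
    return ensemble + (perturbed - Hx) @ K.T

# toy setup: 8-dimensional latent state, only components 0 and 5 observed
n_members, n_state = 64, 8
truth = np.ones(n_state)
ensemble = truth + 0.5 * rng.standard_normal((n_members, n_state))
H = np.zeros((2, n_state))
H[0, 0] = H[1, 5] = 1.0
y = H @ truth + 0.05 * rng.standard_normal(2)
analysis = enkf_update(ensemble, y, H, obs_std=0.05)
```

The update contracts the ensemble spread in the observed components and shifts the ensemble mean toward the data; unobserved components are corrected through the cross-covariance `Pxy`.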
Related papers
- Is Flow Matching Just Trajectory Replay for Sequential Data? [46.770624059457724]
Flow matching (FM) is increasingly used for time-series generation. It is not well understood whether it learns a general dynamical structure or simply performs an effective "trajectory replay". We show that the implied sampler is an ODE whose dynamics constitute a nonparametric, memory-augmented continuous-time dynamical system.
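An ODE-based sampler of this kind can be illustrated with plain Euler integration of a velocity field. The toy field below transports every base sample along a straight line to a fixed point `a`; this velocity is an illustrative choice for the sketch, not the paper's construction.

```python
import numpy as np

def ode_sample(x0, velocity, n_steps=100):
    """Euler integration of dx/dt = v(x, t) from t=0 to t=1 (a generic flow sampler)."""
    x, dt = x0.copy(), 1.0 / n_steps
    for k in range(n_steps):
        t = k * dt                  # t stays strictly below 1, so v is well defined
        x = x + dt * velocity(x, t)
    return x

# toy velocity: straight-line transport of every sample to the point a
a = 1.5
v = lambda x, t: (a - x) / (1.0 - t)

rng = np.random.default_rng(3)
x0 = rng.standard_normal(1000)      # base samples ~ N(0, 1)
x1 = ode_sample(x0, v)
```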
arXiv Detail & Related papers (2026-02-09T06:48:45Z) - Low-Dimensional Adaptation of Rectified Flow: A New Perspective through the Lens of Diffusion and Stochastic Localization [59.04314685837778]
Rectified flow (RF) has gained considerable popularity due to its generation efficiency and state-of-the-art performance. In this paper, we investigate the degree to which RF automatically adapts to the intrinsic low dimensionality of the support of the target distribution to accelerate sampling. We show that, using a carefully designed time-discretization scheme and sufficiently accurate drift estimates, the RF sampler enjoys a complexity of order $O(k/\varepsilon)$.
arXiv Detail & Related papers (2026-01-21T22:09:27Z) - A Simple Approximate Bayesian Inference Neural Surrogate for Stochastic Petri Net Models [0.0]
We introduce a neural-network-based framework for approximating the posterior distribution. Our model employs a lightweight 1D convolutional residual network trained end-to-end on Gillespie-simulated SPN realizations. On synthetic SPNs with 20% missing events, our surrogate recovers rate-function coefficients with an RMSE of 0.108 and runs substantially faster than traditional Bayesian approaches.
arXiv Detail & Related papers (2025-07-14T18:31:19Z) - Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z) - Reduced Order Modeling with Shallow Recurrent Decoder Networks [5.686433280542813]
SHRED-ROM is a robust decoding-only strategy that circumvents the numerically unstable approximation of an inverse. We show that SHRED-ROM accurately reconstructs the state dynamics for new parameter values starting from limited fixed or mobile sensors.
arXiv Detail & Related papers (2025-02-15T23:41:31Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL) a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
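In the linear-Gaussian case, a state space model over the predictor weights reduces to a Kalman recursion. The following is a generic sketch, not the paper's model: it tracks the weights of a scalar linear predictor under a random-walk drift, with noise variances and the toy data stream chosen as assumptions for the example.

```python
import numpy as np

def kalman_step(mean, cov, x, y, obs_var=0.01, drift_var=1e-3):
    """One predict/update step for weights w in y = x @ w + noise."""
    cov = cov + drift_var * np.eye(len(mean))   # random-walk predict (non-stationarity)
    s = x @ cov @ x + obs_var                   # innovation variance
    k = cov @ x / s                             # Kalman gain
    mean = mean + k * (y - x @ mean)            # correct mean with the innovation
    cov = cov - np.outer(k, x) @ cov            # shrink covariance
    return mean, cov

rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])
mean, cov = np.zeros(2), np.eye(2)
for _ in range(200):                            # online stream of (x, y) pairs
    x = rng.standard_normal(2)
    y = x @ w_true + 0.1 * rng.standard_normal()
    mean, cov = kalman_step(mean, cov, x, y)
```

The drift term keeps the posterior covariance from collapsing, so the filter can keep adapting if the true weights later change.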
arXiv Detail & Related papers (2023-06-14T11:41:42Z) - Reduced order modeling of parametrized systems through autoencoders and SINDy approach: continuation of periodic solutions [0.0]
This work presents a data-driven, non-intrusive framework which combines ROM construction with reduced dynamics identification.
The proposed approach leverages autoencoder neural networks with parametric sparse identification of nonlinear dynamics (SINDy) to construct a low-dimensional dynamical model.
The resulting models track the evolution of periodic steady-state responses as functions of the system parameters, avoiding computation of the transient phase and allowing instabilities and bifurcations to be detected.
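At the core of SINDy is a sparse regression of time derivatives onto a library of candidate terms. A minimal sequentially thresholded least-squares sketch follows; this is a generic illustration, not the paper's code, and the noise-free cubic toy system and polynomial library are assumptions for the example.

```python
import numpy as np

def stlsq(theta, dxdt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: the sparse regression behind SINDy."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                 # prune weak library terms
        big = ~small
        if big.any():                   # refit the surviving terms
            xi[big] = np.linalg.lstsq(theta[:, big], dxdt, rcond=None)[0]
    return xi

# toy data from dx/dt = -2x + 0.5x^3 (noise-free for clarity)
x = np.linspace(-1.5, 1.5, 200)
dxdt = -2.0 * x + 0.5 * x**3
theta = np.column_stack([np.ones_like(x), x, x**2, x**3])  # candidate library
xi = stlsq(theta, dxdt)
```

For this noise-free system the procedure zeroes the inactive constant and quadratic terms and keeps only the two true ones.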
arXiv Detail & Related papers (2022-11-13T01:57:18Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
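With per-timestamp means and variances in hand, anomaly scoring reduces to the pointwise Gaussian negative log-likelihood. A generic sketch: the sinusoidal mean and constant variance below stand in for a trained model's outputs, and the injected spike is an assumption of the example.

```python
import numpy as np

def anomaly_scores(x, mean, var):
    """Per-timestamp Gaussian negative log-likelihood as an anomaly score."""
    return 0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

rng = np.random.default_rng(2)
t = np.arange(100)
mean = np.sin(0.2 * t)             # stand-in for a model's predicted mean
var = np.full_like(mean, 0.04)     # predicted variance (std = 0.2)
x = mean + 0.2 * rng.standard_normal(100)
x[50] += 2.0                       # inject a point anomaly
scores = anomaly_scores(x, mean, var)
```

Thresholding `scores` flags timestamps whose observations are improbable under the model's predictive distribution.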
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.