Data-driven Modeling of Parameterized Nonlinear Fluid Dynamical Systems with a Dynamics-embedded Conditional Generative Adversarial Network
- URL: http://arxiv.org/abs/2412.17978v1
- Date: Mon, 23 Dec 2024 20:50:20 GMT
- Title: Data-driven Modeling of Parameterized Nonlinear Fluid Dynamical Systems with a Dynamics-embedded Conditional Generative Adversarial Network
- Authors: Abdolvahhab Rostamijavanani, Shanwu Li, Yongchao Yang
- Abstract summary: We present a data-driven solution to accurately predict parameterized nonlinear fluid dynamical systems using a dynamics-generator conditional GAN (Dyn-cGAN) as a surrogate model.
The learned Dyn-cGAN model takes into account the system parameters to predict the flow fields of the system accurately.
- Abstract: This work presents a data-driven solution to accurately predict parameterized nonlinear fluid dynamical systems using a dynamics-generator conditional GAN (Dyn-cGAN) as a surrogate model. The Dyn-cGAN includes a dynamics block within a modified conditional GAN, enabling the simultaneous identification of temporal dynamics and their dependence on system parameters. The learned Dyn-cGAN model takes into account the system parameters to predict the flow fields of the system accurately. We evaluate the effectiveness and limitations of the developed Dyn-cGAN through numerical studies of various parameterized nonlinear fluid dynamical systems, including flow over a cylinder and a 2-D cavity problem, at different Reynolds numbers. Furthermore, we examine how the Reynolds number affects the accuracy of the predictions for both case studies. Additionally, we investigate the impact of the number of time steps used to train the dynamics block on prediction accuracy, and we find that an optimal value exists based on errors and mutual information relative to the ground truth.
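The paper gives no implementation details beyond the abstract, so the following is only a minimal, hypothetical sketch of how a dynamics-embedded conditional generator of this kind could be organized in PyTorch: a latent state is rolled forward in time by a dynamics block conditioned on the system parameters (e.g., the Reynolds number) and decoded into flow-field snapshots, while a conditional discriminator scores (sequence, parameter) pairs. The module names, the encoder/decoder split, the GRU-based rollout, and all layer sizes are assumptions for illustration, not the authors' Dyn-cGAN architecture.

```python
# Hypothetical Dyn-cGAN-style sketch (PyTorch); structure and sizes are illustrative.
import torch
import torch.nn as nn


class DynamicsBlock(nn.Module):
    """Rolls a latent state forward in time, conditioned on system parameters."""

    def __init__(self, latent_dim: int, param_dim: int):
        super().__init__()
        self.cell = nn.GRUCell(input_size=param_dim, hidden_size=latent_dim)

    def forward(self, z0, params, n_steps):
        z, states = z0, []
        for _ in range(n_steps):
            z = self.cell(params, z)           # parameter-conditioned latent update
            states.append(z)
        return torch.stack(states, dim=1)       # (batch, n_steps, latent_dim)


class Generator(nn.Module):
    """Encodes an initial snapshot, integrates latent dynamics, decodes flow fields."""

    def __init__(self, field_dim: int, latent_dim: int = 64, param_dim: int = 1):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(field_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.dynamics = DynamicsBlock(latent_dim, param_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, field_dim))

    def forward(self, x0, params, n_steps):
        z_traj = self.dynamics(self.encoder(x0), params, n_steps)
        return self.decoder(z_traj)             # (batch, n_steps, field_dim)


class Discriminator(nn.Module):
    """Scores (flow-field sequence, system-parameter) pairs as real or generated."""

    def __init__(self, field_dim: int, param_dim: int = 1):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(field_dim + param_dim, 256), nn.ReLU(),
                                 nn.Linear(256, 1))

    def forward(self, fields, params):
        p = params.unsqueeze(1).expand(-1, fields.shape[1], -1)
        return self.net(torch.cat([fields, p], dim=-1))


if __name__ == "__main__":
    gen = Generator(field_dim=1024)
    x0 = torch.randn(8, 1024)                   # 8 flattened initial flow-field snapshots
    re = torch.rand(8, 1) * 150.0 + 50.0        # hypothetical Reynolds-number conditioning
    print(gen(x0, re, n_steps=20).shape)        # torch.Size([8, 20, 1024])
```

In a setup like this, the number of rollout steps n_steps used during training is a free choice, which is the quantity whose effect on prediction accuracy the paper investigates.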
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
(arXiv: 2024-11-04)
- Deep Generative Modeling for Identification of Noisy, Non-Stationary Dynamical Systems [3.1484174280822845]
We focus on finding parsimonious ordinary differential equation (ODE) models for nonlinear, noisy, and non-autonomous dynamical systems.
Our method, dynamic SINDy, combines variational inference with SINDy (sparse identification of nonlinear dynamics) to model time-varying coefficients of sparse ODEs (a minimal plain-SINDy sketch is given after this list).
(arXiv: 2024-10-02)
- Towards Model Discovery Using Domain Decomposition and PINNs [44.99833362998488]
The study evaluates the performance of two approaches, namely Physics-Informed Neural Networks (PINNs) and Finite Basis Physics-Informed Neural Networks (FBPINNs).
We find that the FBPINN approach performs better than the vanilla PINN approach, even in cases where data come only from a quasi-stationary time domain with limited dynamics.
(arXiv: 2024-10-02)
- Data-Driven Characterization of Latent Dynamics on Quantum Testbeds [0.23408308015481663]
We augment the dynamical equation of quantum systems described by the Lindblad master equation with a parameterized source term.
We consider a structure-preserving augmentation that learns and distinguishes unitary from dissipative latent dynamics parameterized by a basis of linear operators.
We demonstrate that our interpretable, structure-preserving, and nonlinear models are able to improve the prediction accuracy of the Lindblad master equation.
(arXiv: 2024-01-18)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
(arXiv: 2023-04-08)
- Reduced order modeling of parametrized systems through autoencoders and SINDy approach: continuation of periodic solutions [0.0]
This work presents a data-driven, non-intrusive framework which combines ROM construction with reduced dynamics identification.
The proposed approach leverages autoencoder neural networks with parametric sparse identification of nonlinear dynamics (SINDy) to construct a low-dimensional dynamical model.
The model aims to track the evolution of periodic steady-state responses as functions of system parameters, avoiding the computation of the transient phase and allowing instabilities and bifurcations to be detected.
(arXiv: 2022-11-13)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
(arXiv: 2022-02-25)
- Continuous Forecasting via Neural Eigen Decomposition of Stochastic Dynamics [47.82509795873254]
We introduce the Neural Eigen-SDE (NESDE) algorithm for sequential prediction with sparse observations and adaptive dynamics.
NESDE applies eigen-decomposition to the dynamics model to allow efficient frequent predictions given sparse observations.
We are the first to provide a patient-adapted prediction for blood coagulation following Heparin dosing in the MIMIC-IV dataset.
(arXiv: 2022-01-31)
- Disentangled Generative Models for Robust Prediction of System Dynamics [2.6424064030995957]
In this work, we treat the domain parameters of dynamical systems as factors of variation of the data generating process.
By leveraging ideas from supervised disentanglement and causal factorization, we aim to separate the domain parameters from the dynamics in the latent space of generative models.
Results indicate that disentangled VAEs adapt better to domain parameter spaces that were not present in the training data.
(arXiv: 2021-08-26)
- Coarse-Grained Nonlinear System Identification [0.0]
We introduce Coarse-Grained Dynamics, an efficient and universal parameterization of nonlinear system dynamics based on the Volterra series expansion.
We demonstrate the properties of this approach on a simple synthetic problem.
We also demonstrate this approach experimentally, showing that it identifies an accurate model of the nonlinear voltage-driven dynamics of a tungsten filament with less than a second of experimental data.
(arXiv: 2020-10-14)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
(arXiv: 2020-06-08)
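Two of the entries above (dynamic SINDy and the autoencoder-SINDy ROM) build on SINDy, i.e., sparse regression of measured time derivatives onto a library of candidate functions. The sketch below shows only plain SINDy with sequentially thresholded least squares on a toy linear system; the candidate library, threshold, and toy dynamics are illustrative assumptions, and the variational and autoencoder components of those papers are omitted.

```python
# Minimal plain-SINDy sketch: sequentially thresholded least squares on a toy system.
import numpy as np

def library(X):
    """Candidate functions: constant, linear, and quadratic terms of a 2-D state."""
    x, y = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(x)), x, y, x * x, x * y, y * y])

def sindy(X, dXdt, threshold=0.1, n_iter=10):
    """Sparse regression dXdt ~ library(X) @ Xi via iterated thresholding and refitting."""
    Theta = library(X)
    Xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(dXdt.shape[1]):          # refit only the active terms per state
            big = ~small[:, k]
            if big.any():
                Xi[big, k] = np.linalg.lstsq(Theta[:, big], dXdt[:, k], rcond=None)[0]
    return Xi

# Toy example: a damped linear oscillator dX/dt = A X, simulated with forward Euler.
t = np.linspace(0.0, 10.0, 2000)
dt = t[1] - t[0]
A = np.array([[-0.5, 2.0], [-2.0, -0.5]])
X = np.zeros((len(t), 2))
X[0] = [2.0, 0.0]
for i in range(1, len(t)):
    X[i] = X[i - 1] + dt * (A @ X[i - 1])
dXdt = X @ A.T                                  # exact derivatives for this toy system
print(sindy(X, dXdt))                           # linear rows recover A; quadratic terms are zeroed
```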
This list is automatically generated from the titles and abstracts of the papers on this site.