Data-driven low-dimensional dynamic model of Kolmogorov flow
- URL: http://arxiv.org/abs/2210.16708v2
- Date: Tue, 1 Aug 2023 16:38:44 GMT
- Title: Data-driven low-dimensional dynamic model of Kolmogorov flow
- Authors: Carlos E. Pérez De Jesús, Michael D. Graham
- Abstract summary: Reduced order models (ROMs) that capture flow dynamics are of interest for decreasing computational costs for simulation.
This work presents a data-driven framework for minimal-dimensional models that effectively capture the dynamics and properties of the flow.
We apply this to Kolmogorov flow in a regime consisting of chaotic and intermittent behavior.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reduced order models (ROMs) that capture flow dynamics are of interest for
decreasing computational costs for simulation as well as for model-based
control approaches. This work presents a data-driven framework for
minimal-dimensional models that effectively capture the dynamics and properties
of the flow. We apply this to Kolmogorov flow in a regime consisting of chaotic
and intermittent behavior, which is common in many flow processes and is
challenging to model. The trajectory of the flow travels near relative periodic
orbits (RPOs), interspersed with sporadic bursting events corresponding to
excursions between the regions containing the RPOs. The first step in
development of the models is use of an undercomplete autoencoder to map from
the full state data down to a latent space of dramatically lower dimension.
Then models of the discrete-time evolution of the dynamics in the latent space
are developed. By analyzing the model performance as a function of latent space
dimension we can estimate the minimum number of dimensions required to capture
the system dynamics. To further reduce the dimension of the dynamical model, we
factor out a phase variable in the direction of translational invariance for
the flow, leading to separate evolution equations for the pattern and phase. At
a model dimension of five for the pattern dynamics, as opposed to the full
state dimension of 1024 (i.e. a 32x32 grid), accurate predictions are found for
individual trajectories out to about two Lyapunov times, as well as for
long-time statistics. Further small improvements in the results occur at a
dimension of nine. The nearly heteroclinic connections between the different
RPOs, including the quiescent and bursting time scales, are well captured. We
also capture key features of the phase dynamics. Finally, we use the
low-dimensional representation to predict future bursting events, finding good
success.
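The pipeline described in the abstract (an undercomplete autoencoder mapping full snapshots to a low-dimensional latent space, followed by a learned discrete-time map for the latent dynamics) can be illustrated with a minimal sketch. This is not the authors' implementation: the layer widths, the residual form of the latent map, and names such as `Autoencoder`, `LatentDynamics`, and `snapshots` are assumptions made for the example.

```python
# Minimal sketch (not the paper's code) of the two-stage framework: an undercomplete
# autoencoder compresses 32x32 vorticity snapshots to a low-dimensional latent state,
# and a separate network learns the discrete-time evolution in that latent space.
import torch
import torch.nn as nn

latent_dim = 5          # pattern dimension suggested by the paper's analysis
state_dim = 32 * 32     # full state: vorticity on a 32x32 grid

class Autoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, state_dim),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

class LatentDynamics(nn.Module):
    """Discrete-time map z_{t+1} = F(z_t) learned in the latent space."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, latent_dim),
        )

    def forward(self, z):
        return z + self.net(z)   # residual form for a small-time-step map

def train_autoencoder(snapshots, epochs=100):
    """snapshots: (N, 1024) float tensor of flattened vorticity fields."""
    model = Autoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        recon, _ = model(snapshots)
        loss = nn.functional.mse_loss(recon, snapshots)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

After the autoencoder converges, the latent map would be trained on pairs (z_t, z_{t+τ}) extracted from consecutive snapshots, and long trajectories are produced by iterating it from an initial latent state.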
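The pattern/phase separation exploits the translational invariance of Kolmogorov flow. One common way to realize it, shown here only as a hedged sketch rather than the paper's exact construction, is to define the phase from the first Fourier mode in the invariant direction and shift each snapshot to a fixed template before encoding; the wavenumber choice and the function name `split_phase_pattern` are assumptions for illustration.

```python
# Hedged sketch of factoring out the translation phase before encoding: the phase is
# read off the first Fourier mode in the invariant (x) direction, and the field is
# shifted so that this mode has zero phase, leaving only the "pattern".
import numpy as np

def split_phase_pattern(omega):
    """omega: (ny, nx) vorticity field, periodic in x (axis=1)."""
    omega_hat = np.fft.fft(omega, axis=1)
    # Representative phase of the first x-wavenumber, aggregated over y.
    phase = np.angle(omega_hat[:, 1].sum())
    nx = omega.shape[1]
    kx = np.fft.fftfreq(nx, d=1.0 / nx)            # integer wavenumbers 0, 1, ..., -1
    shifted_hat = omega_hat * np.exp(-1j * kx[None, :] * phase)
    pattern = np.real(np.fft.ifft(shifted_hat, axis=1))
    return pattern, phase
```

The autoencoder and latent map then act on `pattern` only, while a separate scalar equation evolves `phase`, as described in the abstract.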
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Unfolding Time: Generative Modeling for Turbulent Flows in 4D [49.843505326598596]
This work introduces a 4D generative diffusion model and a physics-informed guidance technique that enables the generation of realistic sequences of flow states.
Our findings indicate that the proposed method can successfully sample entire subsequences from the turbulent manifold.
This advancement opens doors for the application of generative modeling in analyzing the temporal evolution of turbulent flows.
arXiv Detail & Related papers (2024-06-17T10:21:01Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z)
- Neural Ideal Large Eddy Simulation: Modeling Turbulence with Neural Stochastic Differential Equations [22.707574194338132]
We introduce a data-driven learning framework that assimilates two powerful ideas: ideal large eddy simulation (LES) from turbulence closure modeling and neural stochastic differential equations (SDEs) for stochastic modeling.
We show the effectiveness of our approach on a challenging chaotic dynamical system: Kolmogorov flow at a Reynolds number of 20,000.
arXiv Detail & Related papers (2023-06-01T22:16:28Z)
- Discovering Dynamic Patterns from Spatiotemporal Data with Time-Varying Low-Rank Autoregression [12.923271427789267]
We develop a time-varying reduced-rank vector autoregression model whose coefficients are parameterized by low-rank tensor factorization.
In the temporal context, the complex time-varying system behaviors can be revealed by the temporal modes in the proposed model.
arXiv Detail & Related papers (2022-11-28T15:59:52Z)
- Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z)
- Predicting Physics in Mesh-reduced Space with Temporal Attention [15.054026802351146]
We propose a new method that captures long-term dependencies through a transformer-style temporal attention model.
Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks.
We believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
arXiv Detail & Related papers (2022-01-22T18:32:54Z)
- Dynamical Deep Generative Latent Modeling of 3D Skeletal Motion [15.359134407309726]
Our model decomposes highly correlated skeleton data into a set of a few spatial bases of switching temporal processes.
This results in a dynamical deep generative latent model that parses the meaningful intrinsic states in the dynamics of 3D pose data.
arXiv Detail & Related papers (2021-06-18T23:58:49Z)
- Autoregressive Dynamics Models for Offline Policy Evaluation and Optimization [60.73540999409032]
We show that expressive autoregressive dynamics models generate different dimensions of the next state and reward sequentially conditioned on previous dimensions.
We also show that autoregressive dynamics models are useful for offline policy optimization by serving as a way to enrich the replay buffer.
arXiv Detail & Related papers (2021-04-28T16:48:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.