Predicting the temporal dynamics of turbulent channels through deep
learning
- URL: http://arxiv.org/abs/2203.00974v1
- Date: Wed, 2 Mar 2022 09:31:03 GMT
- Title: Predicting the temporal dynamics of turbulent channels through deep
learning
- Authors: Giuseppe Borrelli, Luca Guastoni, Hamidreza Eivazi, Philipp Schlatter,
Ricardo Vinuesa
- Abstract summary: We aim to assess the capability of neural networks to reproduce the temporal evolution of a minimal turbulent channel flow.
Long short-term memory (LSTM) networks and a Koopman-based framework (KNF) are trained to predict the temporal dynamics of the minimal-channel-flow modes.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The success of recurrent neural networks (RNNs) has been demonstrated in many
applications related to turbulence, including flow control, optimization,
turbulent features reproduction as well as turbulence prediction and modeling.
With this study we aim to assess the capability of these networks to reproduce
the temporal evolution of a minimal turbulent channel flow. We first obtain a
data-driven model based on a modal decomposition in the Fourier domain (which
we denote as FFT-POD) of the time series sampled from the flow. This particular
case of turbulent flow allows us to accurately simulate the most relevant
coherent structures close to the wall. Long short-term memory (LSTM) networks
and a Koopman-based framework (KNF) are trained to predict the temporal
dynamics of the minimal-channel-flow modes. Tests with different configurations
highlight the limits of the KNF method compared to the LSTM, given the
complexity of the flow under study. Long-term predictions with the LSTM show
excellent agreement from the statistical point of view, with errors below 2%
with respect to the reference for the best models. Furthermore, analyses of
the chaotic behaviour through Lyapunov exponents and of the dynamic behaviour
through Poincaré maps emphasize the ability of the LSTM to reproduce the
temporal dynamics of turbulence. Alternative reduced-order models (ROMs),
based on the identification of different turbulent structures, are explored
and show good potential for predicting the temporal dynamics of the minimal
channel.
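The FFT-POD decomposition described in the abstract can be sketched as follows: an FFT in the homogeneous (periodic) direction, followed by an SVD whose left singular vectors give the temporal mode coefficients that the LSTM/KNF models are trained to predict. The toy travelling-wave field, grid sizes, and noise level below are illustrative assumptions, not the paper's channel-flow data.

```python
import numpy as np

# Synthetic snapshot data: n_t time samples of a field on n_x periodic points.
# In the paper the field comes from a minimal turbulent channel simulation;
# here a noisy travelling wave stands in for it (an assumption).
rng = np.random.default_rng(0)
n_t, n_x = 200, 64
x = np.linspace(0.0, 2.0 * np.pi, n_x, endpoint=False)
t = np.linspace(0.0, 10.0, n_t)
field = np.sin(x[None, :] - 2.0 * t[:, None]) + 0.1 * rng.standard_normal((n_t, n_x))

# Step 1: FFT in the homogeneous direction, one column per wavenumber.
field_hat = np.fft.rfft(field, axis=1)        # shape (n_t, n_x // 2 + 1)

# Step 2: POD via SVD of the (time x wavenumber) matrix; the columns of U
# carry the temporal dynamics of each mode, which is what the sequence
# models (LSTM, KNF) are asked to forecast.
U, s, Vh = np.linalg.svd(field_hat, full_matrices=False)
energy = s**2 / np.sum(s**2)                  # relative energy per mode

temporal_coeffs = U[:, :2] * s[:2]            # time series of the two leading modes
print(energy[:3])
```

A sequence model would then be trained on `temporal_coeffs` (windowed input, next-step target); the field is reconstructed by inverting the SVD truncation and the FFT.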
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z) - Expand and Compress: Exploring Tuning Principles for Continual Spatio-Temporal Graph Forecasting [17.530885640317372]
We propose a novel prompt tuning-based continuous forecasting method.
Specifically, we integrate the base spatio-temporal graph neural network with a continuous prompt pool stored in memory.
This method ensures that the model sequentially learns from the wide spatio-temporal data stream to accomplish tasks for corresponding periods.
arXiv Detail & Related papers (2024-10-16T14:12:11Z) - Dynamical system prediction from sparse observations using deep neural networks with Voronoi tessellation and physics constraint [12.638698799995815]
We introduce the Dynamic System Prediction from Sparse Observations using Voronoi Tessellation (DSOVT) framework.
By integrating Voronoi tessellations with deep learning models, DSOVT is adept at predicting dynamical systems with sparse, unstructured observations.
Compared to purely data-driven models, our physics-based approach enables the model to learn physical laws within explicitly formulated dynamics.
arXiv Detail & Related papers (2024-08-31T13:43:52Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Towards Long-Term predictions of Turbulence using Neural Operators [68.8204255655161]
This work aims to develop reduced-order/surrogate models for turbulent flow simulations using machine learning.
Different model structures are analyzed, with U-NET structures performing better than the standard FNO in accuracy and stability.
arXiv Detail & Related papers (2023-07-25T14:09:53Z) - A Neural PDE Solver with Temporal Stencil Modeling [44.97241931708181]
Recent Machine Learning (ML) models have shown new promises in capturing important dynamics in high-resolution signals.
This study shows that significant information is often lost in the low-resolution down-sampled features.
We propose a new approach, which combines the strengths of advanced time-series sequence modeling and state-of-the-art neural PDE solvers.
arXiv Detail & Related papers (2023-02-16T06:13:01Z) - An advanced spatio-temporal convolutional recurrent neural network for
storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - Predicting Physics in Mesh-reduced Space with Temporal Attention [15.054026802351146]
We propose a new method that captures long-term dependencies through a transformer-style temporal attention model.
Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks.
We believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
arXiv Detail & Related papers (2022-01-22T18:32:54Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d)
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations.
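The "networks of linear first-order dynamical systems" idea above can be sketched as a single Euler-integrated cell whose effective time constant is modulated by an input-dependent gate; all weights, sizes, and constants below are illustrative assumptions, not the paper's formulation or trained parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, u, W, b, tau, A, dt):
    # Gate computed from the input; it both drives the state toward A and
    # shortens the effective time constant (hence a "liquid" time constant).
    f = sigmoid(W @ u + b)
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt

rng = np.random.default_rng(2)
n_units, n_in = 4, 3
W = 0.5 * rng.standard_normal((n_units, n_in))
b = np.zeros(n_units)
x = np.zeros(n_units)
for _ in range(100):
    u = rng.standard_normal(n_in)
    x = ltc_step(x, u, W, b, tau=1.0, A=1.0, dt=0.05)

print(x)
```

Because the gate stays in (0, 1) and the drift pulls the state toward a bounded equilibrium, the trajectory remains bounded, which is the stability property the summary highlights.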
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Recurrent neural networks and Koopman-based frameworks for temporal
predictions in a low-order model of turbulence [1.95992742032823]
We show that it is possible to obtain excellent reproductions of the long-term statistics of a chaotic system with properly trained long-short-term memory networks.
A Koopman-based framework, called Koopman with nonlinear forcing (KNF), leads to the same level of accuracy in the statistics at a significantly lower computational expense.
arXiv Detail & Related papers (2020-05-01T11:05:14Z)
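The Koopman-with-nonlinear-forcing (KNF) idea in the entry above can be sketched as one least-squares fit of a linear operator A plus a forcing matrix B acting on a nonlinear observable, so that x_{k+1} ≈ A x_k + B f(x_k). The toy 2-D dynamics and the quadratic dictionary below are illustrative assumptions, not the turbulence modes or observables used in the paper.

```python
import numpy as np

def step(x):
    # Weakly nonlinear toy dynamics used to generate training snapshots.
    return np.array([0.9 * x[0] + 0.1 * x[1],
                     -0.1 * x[0]**2 + 0.95 * x[1]])

X = np.empty((2, 300))
X[:, 0] = [1.0, 0.5]
for k in range(299):
    X[:, k + 1] = step(X[:, k])

def observables(X):
    # Dictionary: the state itself plus one quadratic forcing term.
    return np.vstack([X, X[0:1, :]**2])

G = observables(X[:, :-1])      # (3, 299): [x1; x2; x1^2] at step k
Y = X[:, 1:]                    # states at step k + 1

# One least-squares solve recovers [A | B] jointly.
AB = Y @ np.linalg.pinv(G)      # shape (2, 3)
pred = AB @ observables(X[:, :-1])
err = float(np.max(np.abs(pred - Y)))
print(err)
```

Here the toy dynamics lie exactly in the span of the chosen dictionary, so the one-step residual is near machine precision; for turbulence the dictionary is inexact and the fit is only approximate, which is where the LSTM's extra flexibility pays off.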
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.