Towards Long-Term predictions of Turbulence using Neural Operators
- URL: http://arxiv.org/abs/2307.13517v1
- Date: Tue, 25 Jul 2023 14:09:53 GMT
- Title: Towards Long-Term predictions of Turbulence using Neural Operators
- Authors: Fernando Gonzalez, François-Xavier Demoulin, Simon Bernard
- Abstract summary: It aims to develop reduced-order/surrogate models for turbulent flow simulations using Machine Learning.
Different model structures are analyzed, with U-NET structures performing better than the standard FNO in accuracy and stability.
- Score: 68.8204255655161
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper explores Neural Operators to predict turbulent flows, focusing on
the Fourier Neural Operator (FNO) model. It aims to develop
reduced-order/surrogate models for turbulent flow simulations using Machine
Learning. Different model configurations are analyzed, with U-NET structures
(UNO and U-FNET) performing better than the standard FNO in accuracy and
stability. U-FNET excels in predicting turbulence at higher Reynolds numbers.
Regularization terms, like gradient and stability losses, are essential for
stable and accurate predictions. The study emphasizes the need for improved
metrics for deep learning models in fluid flow prediction. Further research
should focus on models handling complex flows and practical benchmarking
metrics.
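As a concrete illustration of the FNO building block discussed in the abstract, here is a minimal single-channel sketch of one Fourier (spectral-convolution) layer in NumPy. The function name, weight shape, and low-pass usage below are illustrative assumptions, not the paper's implementation; real FNOs learn complex weights per channel pair and stack several such layers with pointwise nonlinearities.

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """One toy Fourier layer: transform, keep low modes, mix, invert.

    u       : (n_points,) real signal on a uniform periodic grid
    weights : (n_modes,) complex multipliers for the retained modes
    n_modes : number of low-frequency Fourier modes to keep
    """
    u_hat = np.fft.rfft(u)                          # to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights   # mix retained modes only
    return np.fft.irfft(out_hat, n=u.shape[0])      # back to physical space

# Usage: with identity weights the layer acts as a low-pass filter,
# so the high-frequency component of this signal is removed.
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(x) + 0.1 * np.sin(20 * x)
out = spectral_conv_1d(u, np.ones(8, dtype=complex), n_modes=8)
```

Truncating to a fixed number of modes is what lets FNO-style layers evaluate on grids of different resolution.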
Related papers
- From Reactive to Proactive Volatility Modeling with Hemisphere Neural Networks [0.0]
We reinvigorate maximum likelihood estimation (MLE) for macroeconomic density forecasting through a novel neural network architecture with dedicated mean and variance hemispheres.
Our Hemisphere Neural Network (HNN) provides proactive volatility forecasts based on leading indicators when it can, and reactive volatility based on the magnitude of previous prediction errors when it must.
arXiv Detail & Related papers (2023-11-27T21:37:50Z)
- Differential Evolution Algorithm based Hyper-Parameters Selection of Transformer Neural Network Model for Load Forecasting [0.0]
Transformer models have the potential to improve Load forecasting because of their ability to learn long-range dependencies derived from their Attention Mechanism.
Our work compares the proposed Transformer based Neural Network model integrated with different metaheuristic algorithms by their performance in Load forecasting based on numerical metrics such as Mean Squared Error (MSE) and Mean Absolute Percentage Error (MAPE)
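The metaheuristic hyperparameter search mentioned above can be pictured with a generic differential-evolution loop. The implementation below is an illustrative DE/rand/1/bin sketch with a stand-in quadratic objective playing the role of validation MSE; the bounds, population size, and objective are assumptions, not the paper's setup.

```python
import numpy as np

def differential_evolution(objective, bounds, pop_size=20, mutation=0.8,
                           crossover=0.7, generations=50, seed=0):
    """Classic DE/rand/1/bin minimization over a box-constrained space."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    scores = np.array([objective(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct members other than i
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + mutation * (b - c), lo, hi)
            mask = rng.random(dim) < crossover
            mask[rng.integers(dim)] = True      # at least one gene crosses over
            trial = np.where(mask, mutant, pop[i])
            s = objective(trial)
            if s < scores[i]:                   # greedy one-to-one selection
                pop[i], scores[i] = trial, s
    best = np.argmin(scores)
    return pop[best], scores[best]

# Stand-in objective: imagine validation MSE as a function of
# (learning_rate, dropout), with a hypothetical optimum at (0.01, 0.2).
obj = lambda p: (p[0] - 0.01) ** 2 + (p[1] - 0.2) ** 2
best, score = differential_evolution(obj, [(1e-4, 0.1), (0.0, 0.5)])
```

In the real workflow the objective would train a Transformer and return its held-out MSE or MAPE, which makes each evaluation expensive; DE is attractive precisely because it is derivative-free.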
arXiv Detail & Related papers (2023-07-28T04:29:53Z)
- Forecasting subcritical cylinder wakes with Fourier Neural Operators [58.68996255635669]
We apply a state-of-the-art operator learning technique to forecast the temporal evolution of experimentally measured velocity fields.
We find that FNOs are capable of accurately predicting the evolution of experimental velocity fields throughout the range of Reynolds numbers tested.
arXiv Detail & Related papers (2023-01-19T20:04:36Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
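LMNT deterministically approximates the effect of injecting many small input-noise realizations, so the baseline it replaces is worth picturing directly. The sketch below is an assumption-laden toy, not LMNT itself: it demonstrates the classical result that averaging a linear least-squares fit over many noisy copies of the input behaves like ridge regression in expectation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

def fit_with_input_noise(X, y, sigma, n_realizations=500):
    """Least-squares fit on many noisy copies of the inputs.

    For a linear model, i.i.d. Gaussian input noise of scale sigma is
    equivalent in expectation to ridge regression with penalty
    n * sigma**2 -- the sampling-based effect that LMNT approximates
    deterministically via linearization.
    """
    Xn = np.concatenate([X + sigma * rng.normal(size=X.shape)
                         for _ in range(n_realizations)])
    yn = np.tile(y, n_realizations)
    w, *_ = np.linalg.lstsq(Xn, yn, rcond=None)
    return w

w_noisy = fit_with_input_noise(X, y, sigma=0.5)
# Closed-form ridge solution with lambda = n * sigma**2 = 200 * 0.25.
w_ridge = np.linalg.solve(X.T @ X + 200 * 0.25 * np.eye(3), X.T @ y)
```

The noise-trained weights are shrunk toward zero relative to the noiseless solution, which is exactly the damping effect used to tame error growth in autoregressive forecasts.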
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Towards prediction of turbulent flows at high Reynolds numbers using high performance computing data and deep learning [0.39146761527401425]
Various generative adversarial networks (GANs) are discussed with respect to their suitability for understanding and modeling turbulence.
Wasserstein GANs (WGANs) are then chosen to generate small-scale turbulence.
DNS turbulent data is used for training the WGANs and the effect of network parameters, such as learning rate and loss function, is studied.
arXiv Detail & Related papers (2022-10-28T13:14:06Z)
- Evaluating the Adversarial Robustness for Fourier Neural Operators [78.36413169647408]
The Fourier Neural Operator (FNO) was the first model to simulate turbulent flow with zero-shot super-resolution.
We generate adversarial examples for FNO based on norm-bounded data input perturbations.
Our results show that the model's robustness degrades rapidly with increasing perturbation levels.
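A norm-bounded input perturbation of the kind described can be sketched with a fast-gradient-sign step. The linear "model", inputs, and budget below are toy assumptions standing in for the FNO; the point is only the mechanics of an L-infinity-bounded attack.

```python
import numpy as np

def fgsm_perturbation(x, w, y, eps):
    """Fast-gradient-sign step under an L-infinity budget eps.

    For the toy linear model f(x) = w @ x with squared-error loss,
    the input gradient is 2 * (w @ x - y) * w; stepping along its
    sign increases the loss as much as a first-order move inside
    the norm ball allows.
    """
    grad = 2.0 * (w @ x - y) * w
    return x + eps * np.sign(grad)

w = np.array([0.5, -1.0, 2.0])      # toy "model" weights
x = np.array([1.0, 2.0, 0.5])       # clean input
y = 0.0                             # target output
loss = lambda x: (w @ x - y) ** 2
x_adv = fgsm_perturbation(x, w, y, eps=0.1)
```

Sweeping `eps` upward and plotting the resulting loss is the simplest way to reproduce the qualitative finding that robustness degrades rapidly with perturbation level.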
arXiv Detail & Related papers (2022-04-08T19:19:42Z)
- Predicting the temporal dynamics of turbulent channels through deep learning [0.0]
We aim to assess the capability of neural networks to reproduce the temporal evolution of a minimal turbulent channel flow.
Long-short-term-memory (LSTM) networks and a Koopman-based framework (KNF) are trained to predict the temporal dynamics of the minimal-channel-flow modes.
arXiv Detail & Related papers (2022-03-02T09:31:03Z)
- Deep Learning to advance the Eigenspace Perturbation Method for Turbulence Model Uncertainty Quantification [0.0]
We outline a machine learning approach to aid the use of the Eigenspace Perturbation Method in predicting the uncertainty of turbulence model predictions.
We use a trained neural network to predict the discrepancy in the shape of the RANS-predicted Reynolds stress ellipsoid.
arXiv Detail & Related papers (2022-02-11T08:06:52Z)
- Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters than the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z)
- Robust Implicit Networks via Non-Euclidean Contractions [63.91638306025768]
Implicit neural networks show improved accuracy and significant reduction in memory consumption.
They can suffer from ill-posedness and convergence instability.
This paper provides a new framework to design well-posed and robust implicit neural networks.
arXiv Detail & Related papers (2021-06-06T18:05:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.