A Numerical Proof of Shell Model Turbulence Closure
- URL: http://arxiv.org/abs/2202.09289v2
- Date: Tue, 25 Jun 2024 09:40:14 GMT
- Title: A Numerical Proof of Shell Model Turbulence Closure
- Authors: Giulio Ortali, Alessandro Corbetta, Gianluigi Rozza, Federico Toschi
- Abstract summary: We present a closure, based on deep recurrent neural networks, that quantitatively reproduces, within statistical errors, Eulerian and Lagrangian structure functions and the intermittent statistics of the energy cascade.
Our results encourage the development of similar approaches for 3D Navier-Stokes turbulence.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The development of turbulence closure models, parametrizing the influence of small non-resolved scales on the dynamics of large resolved ones, is an outstanding theoretical challenge with vast applicative relevance. We present a closure, based on deep recurrent neural networks, that quantitatively reproduces, within statistical errors, Eulerian and Lagrangian structure functions and the intermittent statistics of the energy cascade, including those of subgrid fluxes. To achieve high-order statistical accuracy, and thus a stringent statistical test, we employ shell models of turbulence. Our results encourage the development of similar approaches for 3D Navier-Stokes turbulence.
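As an illustration of the testbed (a generic Sabra-type shell model with standard coefficient choices, not the authors' code or parameters), the shell dynamics and the Eulerian structure functions used for statistical validation can be sketched as:

```python
import numpy as np

def sabra_rhs(u, k, nu, f, a=1.0, b=-0.5):
    """Right-hand side of the Sabra shell model,
        du_n/dt = i(a k_{n+1} u_{n+2} u*_{n+1} + b k_n u_{n+1} u*_{n-1}
                    - c k_{n-1} u_{n-1} u_{n-2}) - nu k_n^2 u_n + f_n,
    with the energy-conserving choice c = -a - b and zero boundary shells."""
    c = -a - b
    N = len(u)
    du = np.zeros(N, dtype=complex)
    for n in range(N):
        t1 = k[n + 1] * u[n + 2] * np.conj(u[n + 1]) if n + 2 < N else 0.0
        t2 = k[n] * u[n + 1] * np.conj(u[n - 1]) if 0 < n < N - 1 else 0.0
        t3 = k[n - 1] * u[n - 1] * u[n - 2] if n >= 2 else 0.0
        du[n] = 1j * (a * t1 + b * t2 - c * t3) - nu * k[n] ** 2 * u[n] + f[n]
    return du

def structure_functions(u_samples, p):
    """Eulerian structure functions S_p(k_n) = <|u_n|^p>, averaged over an
    ensemble of shell-velocity snapshots of shape (samples, shells)."""
    return np.mean(np.abs(u_samples) ** p, axis=0)
```

With wavenumbers k_n = k_0 * 2^n, the nonlinear term conserves energy exactly, which is what makes high-order statistics of the cascade a stringent test; in a closure setting, only the low-n shells would be resolved and a learned model would supply the unresolved coupling.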
Related papers
- CoNFiLD-inlet: Synthetic Turbulence Inflow Using Generative Latent Diffusion Models with Neural Fields [7.646019826936172]
Eddy-resolving turbulence simulations require inflow conditions that accurately replicate the complex, multi-scale structures of turbulence.
Traditional recycling-based methods rely on computationally expensive simulations, while existing synthetic inflow generators often fail to reproduce realistic coherent structures of turbulence.
We present CoNFiLD-inlet, a novel DL-based inflow generator that operates in a latent space to produce realistic inflow turbulence.
arXiv Detail & Related papers (2024-11-21T18:13:03Z)
- The Risk of Federated Learning to Skew Fine-Tuning Features and Underperform Out-of-Distribution Robustness [50.52507648690234]
Federated learning has the risk of skewing fine-tuning features and compromising the robustness of the model.
We introduce three robustness indicators and conduct experiments across diverse robust datasets.
Our approach markedly enhances the robustness across diverse scenarios, encompassing various parameter-efficient fine-tuning methods.
arXiv Detail & Related papers (2024-01-25T09:18:51Z)
- Synthetic Lagrangian Turbulence by Generative Diffusion Models [1.7810134788247751]
We propose a machine learning approach to generate single-particle trajectories in three-dimensional turbulence at high Reynolds numbers.
Our model demonstrates the ability to reproduce most statistical benchmarks across time scales.
Surprisingly, the model exhibits strong generalizability for extreme events, producing events of higher intensity and rarity that still match the realistic statistics.
arXiv Detail & Related papers (2023-07-17T14:42:32Z)
- A unified method of data assimilation and turbulence modeling for separated flows at high Reynolds numbers [0.0]
In this paper, we propose an improved ensemble Kalman inversion method as a unified approach to data assimilation and turbulence modeling.
The trainable parameters of the DNN are optimized according to the given experimental surface pressure coefficients.
The results show that through joint assimilation of very few experimental states, we obtain turbulence models that generalize well to both attached and separated flows.
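The ensemble Kalman inversion update underlying this kind of assimilation has a compact generic form; a minimal sketch follows, with a toy linear forward model standing in for the DNN-augmented turbulence model (this is an illustration of the technique, not the paper's implementation):

```python
import numpy as np

def eki_step(theta, y, G, gamma):
    """One deterministic ensemble Kalman inversion update:
    theta_j <- theta_j + C_tg (C_gg + gamma I)^{-1} (y - G(theta_j)).
    theta: (J, d) parameter ensemble; y: (m,) observations (e.g. measured
    surface pressure coefficients); G: forward model; gamma: noise variance."""
    g = np.array([G(t) for t in theta])                    # (J, m)
    dt, dg = theta - theta.mean(axis=0), g - g.mean(axis=0)
    C_tg = dt.T @ dg / len(theta)                          # (d, m)
    C_gg = dg.T @ dg / len(theta)                          # (m, m)
    K = C_tg @ np.linalg.inv(C_gg + gamma * np.eye(len(y)))
    return theta + (y - g) @ K.T

# Toy problem: recover theta* = (1, 2) from linear observations y = A theta*.
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
truth = np.array([1.0, 2.0])
y = A @ truth
rng = np.random.default_rng(0)
theta = rng.standard_normal((100, 2))                      # initial ensemble
for _ in range(100):
    theta = eki_step(theta, y, lambda t: A @ t, gamma=1e-2)
```

The derivative-free nature of the update is what lets the same loop train black-box model parameters against experimental data.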
arXiv Detail & Related papers (2022-11-01T17:17:53Z)
- Triangular Flows for Generative Modeling: Statistical Consistency, Smoothness Classes, and Fast Rates [8.029049649310211]
Triangular flows, also known as Knöthe-Rosenblatt measure couplings, comprise an important building block of normalizing flow models.
We present statistical guarantees and sample complexity bounds for triangular flow statistical models.
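A Knöthe-Rosenblatt coupling transforms one coordinate at a time, with each map conditioned on the previous coordinates. For a bivariate Gaussian the triangular map to the uniform distribution is available in closed form, which makes for a compact illustration (closed-form toy example, not a learned flow):

```python
import math

def std_normal_cdf(z):
    """CDF of the standard normal, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def kr_map(x1, x2, rho):
    """Knothe-Rosenblatt map for (X1, X2) jointly Gaussian with unit
    variances and correlation rho: u1 = F(x1), u2 = F(x2 | x1).
    Returns a pair of independent Uniform(0, 1) variables."""
    u1 = std_normal_cdf(x1)
    u2 = std_normal_cdf((x2 - rho * x1) / math.sqrt(1.0 - rho ** 2))
    return u1, u2
```

A learned triangular flow parametrizes exactly this structure, one monotone conditional map per coordinate, which is what the paper's guarantees are stated for.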
arXiv Detail & Related papers (2021-12-31T18:57:37Z)
- Estimation of Bivariate Structural Causal Models by Variational Gaussian Process Regression Under Likelihoods Parametrised by Normalising Flows [74.85071867225533]
Causal mechanisms can be described by structural causal models.
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
arXiv Detail & Related papers (2021-09-06T14:52:58Z)
- Divergence Frontiers for Generative Models: Sample Complexity, Quantization Level, and Frontier Integral [58.434753643798224]
Divergence frontiers have been proposed as an evaluation framework for generative models.
We establish non-asymptotic bounds on the sample complexity of the plug-in estimator of divergence frontiers.
We also augment the divergence frontier framework by investigating the statistical performance of smoothed distribution estimators.
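The plug-in estimator in question evaluates, on a quantized support, the divergence pairs traced out by mixtures of the two distributions. A minimal sketch, using add-constant smoothing as one common example of a smoothed distribution estimator (the specific smoothing constant here is an illustrative choice):

```python
import numpy as np

def kl(p, q):
    """KL divergence between discrete distributions on a common support."""
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def smoothed_hist(samples, bins, alpha=0.5):
    """Add-constant smoothed histogram estimator over bins 0..bins-1."""
    counts = np.bincount(samples, minlength=bins).astype(float) + alpha
    return counts / counts.sum()

def divergence_frontier(p, q, lambdas):
    """Plug-in divergence frontier: for each mixture R = lam*P + (1-lam)*Q,
    the point (KL(P||R), KL(Q||R)); lambdas should lie in (0, 1)."""
    return [(kl(p, lam * p + (1 - lam) * q),
             kl(q, lam * p + (1 - lam) * q))
            for lam in lambdas]
```

Sweeping the mixture weight traces a curve between the two one-sided divergences; the smoothing keeps every bin positive so the plug-in KL terms stay finite.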
arXiv Detail & Related papers (2021-06-15T06:26:25Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the reasons for its success remain unclear.
We show that multiplicative noise commonly arises in the dynamics of model parameters and leads to heavy-tailed behavior.
A detailed analysis describes how key factors, including step size and data properties, shape these effects; state-of-the-art neural network models exhibit similar behavior.
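How multiplicative noise produces heavy tails can be seen in a one-line toy recursion, a Kesten-type process used here as an illustrative stand-in rather than a model of SGD: x_{t+1} = a_t x_t + b_t, where the random multiplier contracts on average but sometimes expands, yields a power-law stationary tail even though both noise sources are light-tailed.

```python
import numpy as np

def kesten_samples(n, rng):
    """Iterate x_{t+1} = a_t x_t + b_t with a_t in {0.5, 1.5} equally
    likely (so E[log a] < 0 but P(a > 1) > 0) and Gaussian additive noise.
    Since E[a] = 1, the stationary tail satisfies P(|X| > x) ~ c/x
    (tail index 1), far heavier than either noise source alone."""
    x, out = 0.0, np.empty(n)
    for t in range(n):
        a = 0.5 if rng.random() < 0.5 else 1.5
        x = a * x + rng.standard_normal()
        out[t] = x
    return out[1000:]   # drop burn-in before the stationary regime
```

Large excursions come from rare runs of expanding multipliers compounding, which is the mechanism the paper identifies in parameter dynamics.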
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- On the stability of projection-based model order reduction for convection-dominated laminar and turbulent flows [0.0]
It is often claimed that due to modal truncation, a projection-based reduced-order model (PROM) does not resolve the dissipative regime of the turbulent energy cascade and therefore is numerically unstable.
This paper explores the relationship between projection-based model order reduction and semi-discretization. Using numerical evidence from three relevant flow problems, it argues that the real culprit behind most, if not all, reported numerical instabilities of PROMs for turbulence and convection-dominated flow problems is the Galerkin framework used to construct them.
arXiv Detail & Related papers (2020-01-27T22:39:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.