Single-snapshot machine learning for super-resolution of turbulence
- URL: http://arxiv.org/abs/2409.04923v2
- Date: Sat, 23 Nov 2024 00:43:06 GMT
- Title: Single-snapshot machine learning for super-resolution of turbulence
- Authors: Kai Fukami, Kunihiko Taira
- Abstract summary: Nonlinear machine-learning techniques can effectively extract physical insights from as little as a single snapshot of turbulent flow.
We show that a machine-learning model trained with flow tiles sampled from only a single snapshot can reconstruct vortical structures across a range of Reynolds numbers.
This work hopes to stop machine-learning practitioners from being wasteful with turbulent flow data.
- Abstract: Modern machine-learning techniques are generally considered data-hungry. However, this may not be the case for turbulence as each of its snapshots can hold more information than a single data file in general machine-learning settings. This study asks the question of whether nonlinear machine-learning techniques can effectively extract physical insights even from as little as a *single* snapshot of turbulent flow. As an example, we consider machine-learning-based super-resolution analysis that reconstructs a high-resolution field from low-resolution data for two examples of two-dimensional isotropic turbulence and three-dimensional turbulent channel flow. First, we reveal that a carefully designed machine-learning model trained with flow tiles sampled from only a single snapshot can reconstruct vortical structures across a range of Reynolds numbers for two-dimensional decaying turbulence. Successful flow reconstruction indicates that nonlinear machine-learning techniques can leverage scale-invariance properties to learn turbulent flows. We also show that training data of turbulent flows can be cleverly collected from a single snapshot by considering characteristics of rotation and shear tensors. Second, we perform the single-snapshot super-resolution analysis for turbulent channel flow, showing that it is possible to extract physical insights from a single flow snapshot even with inhomogeneity. The present findings suggest that embedding prior knowledge in designing a model and collecting data is important for a range of data-driven analyses for turbulent flows. More broadly, this work hopes to stop machine-learning practitioners from being wasteful with turbulent flow data.
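The abstract's core idea, that one snapshot can supply many training samples by tiling it into sub-fields, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tile size, stride, and average-pooling downsampling operator are all illustrative assumptions, and the random field stands in for a vorticity snapshot.

```python
import numpy as np

def sample_tiles(snapshot, tile, stride):
    """Extract overlapping tiles from a single 2D snapshot.
    Each tile acts as an independent training sample, so one
    snapshot yields a whole dataset."""
    h, w = snapshot.shape
    tiles = [snapshot[i:i + tile, j:j + tile]
             for i in range(0, h - tile + 1, stride)
             for j in range(0, w - tile + 1, stride)]
    return np.stack(tiles)

def downsample(tiles, factor):
    """Average-pool each tile to mimic a low-resolution measurement."""
    n, t, _ = tiles.shape
    return tiles.reshape(n, t // factor, factor,
                         t // factor, factor).mean(axis=(2, 4))

rng = np.random.default_rng(0)
snapshot = rng.standard_normal((128, 128))   # stand-in for one vorticity field
hi = sample_tiles(snapshot, tile=32, stride=16)  # high-resolution targets
lo = downsample(hi, factor=4)                    # paired low-resolution inputs
print(hi.shape, lo.shape)  # (49, 32, 32) (49, 8, 8)
```

A super-resolution model would then be trained to map each `lo` tile back to its `hi` counterpart; the paper's point is that the tile ensemble from a single snapshot already spans enough scale-invariant structure for this to work.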
Related papers
- Deep Learning Through A Telescoping Lens: A Simple Model Provides Empirical Insights On Grokking, Gradient Boosting & Beyond [61.18736646013446]
In pursuit of a deeper understanding of its surprising behaviors, we investigate the utility of a simple yet accurate model of a trained neural network.
Across three case studies, we illustrate how it can be applied to derive new empirical insights on a diverse range of prominent phenomena.
arXiv Detail & Related papers (2024-10-31T22:54:34Z) - Unfolding Time: Generative Modeling for Turbulent Flows in 4D [49.843505326598596]
This work introduces a 4D generative diffusion model and a physics-informed guidance technique that enables the generation of realistic sequences of flow states.
Our findings indicate that the proposed method can successfully sample entire subsequences from the turbulent manifold.
This advancement opens doors for the application of generative modeling in analyzing the temporal evolution of turbulent flows.
arXiv Detail & Related papers (2024-06-17T10:21:01Z) - Turbulence Scaling from Deep Learning Diffusion Generative Models [0.8192907805418583]
We employ a diffusion-based generative model to learn the distribution of turbulent vorticity profiles.
We generate snapshots of turbulent solutions to the incompressible Navier-Stokes equations.
All the learnt scaling exponents are consistent with the expected Kolmogorov scaling.
arXiv Detail & Related papers (2023-11-10T15:27:07Z) - In-Context Convergence of Transformers [63.04956160537308]
We study the learning dynamics of a one-layer transformer with softmax attention trained via gradient descent.
For data with imbalanced features, we show that the learning dynamics take a stage-wise convergence process.
arXiv Detail & Related papers (2023-10-08T17:55:33Z) - Multiscale Flow for Robust and Optimal Cosmological Analysis [7.977229957867868]
Multiscale Flow is a generative Normalizing Flow that creates samples and models the field-level likelihood of two-dimensional cosmological data.
We show that Multiscale Flow is able to identify distribution shifts absent from the training data, such as baryonic effects.
arXiv Detail & Related papers (2023-06-07T18:00:06Z) - From Zero to Turbulence: Generative Modeling for 3D Flow Simulation [45.626346087828765]
We propose to approach turbulent flow simulation as a generative task directly learning the manifold of all possible turbulent flow states without relying on any initial flow state.
Our generative model captures the distribution of turbulent flows caused by unseen objects and generates high-quality, realistic samples for downstream applications.
arXiv Detail & Related papers (2023-05-29T18:20:28Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Inferring Turbulent Parameters via Machine Learning [0.0]
We design a machine learning technique to solve the general problem of inferring physical parameters from the observation of turbulent flows.
Our approach is to train the machine learning system to regress the rotation frequency of the flow's reference frame.
This study shows interesting results from two different points of view.
arXiv Detail & Related papers (2022-01-03T16:08:48Z) - Generative Flows with Invertible Attentions [135.23766216657745]
We introduce two types of invertible attention mechanisms for generative flow models.
We exploit split-based attention mechanisms to learn the attention weights and input representations on every two splits of flow feature maps.
Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position in flow-based models.
arXiv Detail & Related papers (2021-06-07T20:43:04Z) - Machine learning for rapid discovery of laminar flow channel wall modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z) - Reconstruction of turbulent data with deep generative models for semantic inpainting from TURB-Rot database [0.0]
We study the applicability of tools developed by the computer vision community for features learning and semantic image inpainting to perform data reconstruction of fluid turbulence configurations.
We investigate the capability of Convolutional Neural Networks embedded in a Deep Generative Adversarial Model (Deep-GAN) to generate missing data in turbulence.
arXiv Detail & Related papers (2020-06-16T14:26:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.