Generative AI for fast and accurate Statistical Computation of Fluids
- URL: http://arxiv.org/abs/2409.18359v1
- Date: Fri, 27 Sep 2024 00:26:18 GMT
- Title: Generative AI for fast and accurate Statistical Computation of Fluids
- Authors: Roberto Molinaro, Samuel Lanthaler, Bogdan Raonić, Tobias Rohner, Victor Armegioiu, Zhong Yi Wan, Fei Sha, Siddhartha Mishra, Leonardo Zepeda-Núñez
- Abstract summary: We present a generative AI algorithm for addressing the challenging task of fast, accurate and robust statistical computation.
Our algorithm, termed GenCFD, is based on a conditional score-based diffusion model.
- Score: 21.820160898966055
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a generative AI algorithm for the challenging task of fast, accurate and robust statistical computation of three-dimensional turbulent fluid flows. Our algorithm, termed GenCFD, is based on a conditional score-based diffusion model. Through extensive numerical experimentation with both incompressible and compressible fluid flows, we demonstrate that GenCFD provides highly accurate approximations of statistical quantities of interest, such as the mean, variance, point PDFs, and higher-order moments, while also generating high-quality, realistic samples of turbulent fluid flows and ensuring excellent spectral resolution. In contrast, ensembles of operator-learning baselines trained to minimize mean (absolute) square errors regress to the mean flow. We present rigorous theoretical results uncovering the surprising mechanisms through which diffusion models accurately generate fluid flows. These mechanisms are illustrated with solvable toy models that exhibit the relevant features of turbulent fluid flows while remaining amenable to explicit analytical formulas.
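The conditional score-based diffusion recipe on which GenCFD rests can be illustrated with a toy sketch. The snippet below is a minimal, hypothetical 1-D illustration, not the authors' implementation: it assumes the conditional data distribution given a condition c is Gaussian, so the score of the noise-perturbed marginal is available in closed form, and it draws samples by annealed Langevin dynamics. In GenCFD a neural network approximates this score for turbulent flow fields.

```python
import numpy as np

# Hypothetical 1-D illustration of a conditional score-based diffusion
# model (NOT the GenCFD implementation). The data distribution given a
# condition c is assumed to be N(c, DATA_STD^2), so the score of the
# noise-perturbed marginal N(c, DATA_STD^2 + sigma^2) is known in
# closed form; a trained network would approximate this score.

DATA_STD = 0.1

def score(x, c, sigma):
    """Exact score grad_x log p_sigma(x | c) of the noised marginal."""
    return (c - x) / (DATA_STD**2 + sigma**2)

def sample(c, n=4000, n_levels=300, steps_per_level=5,
           sigma_max=2.0, sigma_min=0.01, seed=0):
    """Annealed Langevin dynamics guided by the conditional score."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_max, size=n)     # start from broad noise
    for sigma in np.geomspace(sigma_max, sigma_min, n_levels):
        alpha = 0.1 * sigma**2                 # step size shrinks with noise
        for _ in range(steps_per_level):
            z = rng.normal(size=n)
            x = x + 0.5 * alpha * score(x, c, sigma) + np.sqrt(alpha) * z
    return x

samples = sample(c=1.5)
# Samples should concentrate around the conditional mean c = 1.5
# with spread close to DATA_STD.
```

Because the exact score is used, the sampler recovers the target conditional statistics; the practical difficulty, addressed by the paper, is learning such a score for high-dimensional turbulent flows.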
Related papers
- DiffFluid: Plain Diffusion Models are Effective Predictors of Flow Dynamics [16.660107496540146]
We showcase plain diffusion models with Transformers as effective predictors of fluid dynamics under various working conditions.
Our approach formulates the prediction of flow dynamics as an image-translation problem and accordingly leverages a plain diffusion model to tackle it.
arXiv Detail & Related papers (2024-09-20T17:19:03Z) - Inpainting Computational Fluid Dynamics with Deep Learning [8.397730500554047]
An effective fluid data completion method reduces the required number of sensors in a fluid dynamics experiment.
The ill-posed nature of the fluid data completion problem makes it prohibitively difficult to obtain a theoretical solution.
We employ the vector quantization technique to map both complete and incomplete fluid data spaces onto discrete-valued lower-dimensional representations.
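The vector-quantization step described above amounts to nearest-codebook assignment: continuous latent vectors are mapped to discrete codebook indices. The snippet below is a minimal sketch with a random stand-in codebook; in a real model the codebook is learned jointly with an encoder and decoder (as in VQ-VAE-style models).

```python
import numpy as np

# Minimal sketch of vector quantization: continuous latent vectors are
# mapped to the nearest entry of a codebook, yielding discrete indices.
# The codebook here is random, a stand-in for one learned jointly with
# an encoder/decoder.

rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 4))      # 16 codes, 4-dim latents

def quantize(z, codebook):
    """Nearest-neighbour assignment: return indices and quantized vectors."""
    dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = dists.argmin(axis=1)
    return idx, codebook[idx]

z = rng.normal(size=(8, 4))              # e.g. encoded field patches
idx, zq = quantize(z, codebook)
```

Both complete and incomplete fields can be pushed through the same quantizer, giving the discrete-valued lower-dimensional representations the summary refers to.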
arXiv Detail & Related papers (2024-02-27T03:44:55Z) - Uncertainty-aware Surrogate Models for Airfoil Flow Simulations with Denoising Diffusion Probabilistic Models [26.178192913986344]
We make a first attempt to use denoising diffusion probabilistic models (DDPMs) to train an uncertainty-aware surrogate model for turbulence simulations.
Our results show DDPMs can successfully capture the whole distribution of solutions and, as a consequence, accurately estimate the uncertainty of the simulations.
We also evaluate an emerging generative modeling variant, flow matching, in comparison to regular diffusion models.
arXiv Detail & Related papers (2023-12-08T19:04:17Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a significant speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Benchmarking Autoregressive Conditional Diffusion Models for Turbulent Flow Simulation [29.806100463356906]
We analyze whether fully data-driven fluid solvers that utilize an autoregressive rollout based on conditional diffusion models are a viable option.
We investigate accuracy, posterior sampling, spectral behavior, and temporal stability, while requiring that methods generalize to flow parameters beyond the training regime.
We find that even simple diffusion-based approaches can outperform multiple established flow prediction methods in terms of accuracy and temporal stability, while being on par with state-of-the-art stabilization techniques like unrolling at training time.
arXiv Detail & Related papers (2023-09-04T18:01:42Z) - Eliminating Lipschitz Singularities in Diffusion Models [51.806899946775076]
We show that diffusion models frequently exhibit an unbounded Lipschitz constant near the zero point of timesteps.
This poses a threat to the stability and accuracy of the diffusion process, which relies on integral operations.
We propose a novel approach, dubbed E-TSDM, which eliminates the Lipschitz singularity of the diffusion model near zero.
arXiv Detail & Related papers (2023-06-20T03:05:28Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis of the approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z) - A Denoising Diffusion Model for Fluid Field Prediction [0.0]
We propose FluidDiff, a novel denoising diffusion generative model for predicting nonlinear fluid fields.
By performing a diffusion process, the model is able to learn a complex representation of the high-dimensional dynamic system.
Langevin sampling is used to generate predictions for the flow state under specified initial conditions.
arXiv Detail & Related papers (2023-01-27T11:30:40Z) - How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.