A Physics-informed Diffusion Model for High-fidelity Flow Field
Reconstruction
- URL: http://arxiv.org/abs/2211.14680v1
- Date: Sat, 26 Nov 2022 23:14:18 GMT
- Title: A Physics-informed Diffusion Model for High-fidelity Flow Field
Reconstruction
- Authors: Dule Shu, Zijie Li, Amir Barati Farimani
- Abstract summary: We propose a diffusion model that uses only high-fidelity data during training.
With different configurations, our model is able to reconstruct high-fidelity data from either a regular low-fidelity sample or a sparsely measured sample.
Our model can produce accurate reconstructions of 2D turbulent flows from different input sources without retraining.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning models are gaining increasing popularity in the domain of
fluid dynamics for their potential to accelerate the production of
high-fidelity computational fluid dynamics data. However, many recently
proposed machine learning models for high-fidelity data reconstruction require
low-fidelity data for model training. This requirement limits the
applicability of such models, since their reconstruction accuracy drops
significantly when the low-fidelity input data encountered at test time
deviates substantially from the training data. To overcome this limitation,
we propose a diffusion model that uses only high-fidelity data during
training.
With different configurations, our model is able to reconstruct high-fidelity
data from either a regular low-fidelity sample or a sparsely measured sample,
and can further improve its accuracy by incorporating physics-informed
conditioning information from a known partial differential equation when one
is available. Experimental results demonstrate that our model produces
accurate reconstructions of 2D turbulent flows from different input sources
without retraining.
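The abstract does not detail the sampling procedure, but one common way to realize physics-informed conditioning in a diffusion sampler is to steer each reverse step with the gradient of a PDE-residual penalty on the current estimate of the clean field. The sketch below is a hypothetical illustration of that idea in PyTorch; `denoiser`, `pde_residual`, and the noise schedules are assumed placeholders, not the paper's actual interface.

```python
# Hypothetical sketch of one guided reverse-diffusion step: the sampler is
# conditioned on a low-fidelity observation and nudged by the gradient of a
# PDE-residual penalty. All names and schedules are placeholders.
import torch

def guided_reverse_step(x_t, t, denoiser, pde_residual, lowres_obs,
                        alpha, alpha_bar, beta, guidance_scale=1.0):
    """One DDPM-style ancestral step with PDE-residual guidance."""
    with torch.enable_grad():
        x_t = x_t.detach().requires_grad_(True)
        eps = denoiser(x_t, t, cond=lowres_obs)          # predicted noise
        # Tweedie estimate of the clean field from the current noisy sample.
        x0_hat = (x_t - (1 - alpha_bar[t]).sqrt() * eps) / alpha_bar[t].sqrt()
        # Penalize violation of the known PDE on the estimated clean field.
        penalty = pde_residual(x0_hat).pow(2).mean()
        grad = torch.autograd.grad(penalty, x_t)[0]
    # Standard DDPM posterior mean, shifted by the physics-guidance gradient.
    mean = (x_t - beta[t] / (1 - alpha_bar[t]).sqrt() * eps) / alpha[t].sqrt()
    mean = mean - guidance_scale * grad
    noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
    return (mean + beta[t].sqrt() * noise).detach()
```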
Related papers
- Deep learning for model correction of dynamical systems with data scarcity [0.0]
We present a deep learning framework for correcting existing dynamical system models utilizing only a scarce high-fidelity data set.
We focus on the case when the amount of high-fidelity data is so small that most of the existing data driven modeling methods cannot be applied.
arXiv Detail & Related papers (2024-10-23T14:33:11Z)
- Physics-integrated generative modeling using attentive planar normalizing flow based variational autoencoder [0.0]
We aim to improve the fidelity of reconstruction and the robustness to noise in the physics-integrated generative model.
To improve the robustness of the generative model against noise injected into the model, we propose a modification to the encoder of the normalizing-flow-based VAE.
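For context, the planar flow named in the title is the standard transform f(z) = z + u·tanh(wᵀz + b) of Rezende & Mohamed (2015). A minimal layer, without the paper's attentive or robustness modifications, might look like this sketch.

```python
# Minimal planar normalizing-flow layer; a generic building block, not the
# paper's modified encoder. The invertibility constraint on u is omitted.
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        # f(z) = z + u * tanh(w^T z + b), z of shape (batch, dim)
        lin = z @ self.w + self.b
        f = z + self.u * torch.tanh(lin).unsqueeze(-1)
        # log|det J| = log|1 + u^T psi|, psi = (1 - tanh^2) * w
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log((1 + psi @ self.u).abs() + 1e-8)
        return f, log_det
```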
arXiv Detail & Related papers (2024-04-18T15:38:14Z)
- PiRD: Physics-informed Residual Diffusion for Flow Field Reconstruction [5.06136344261226]
CNN-based methods for data fidelity enhancement rely on low-fidelity data patterns and distributions during the training phase.
Our proposed model - Physics-informed Residual Diffusion - demonstrates the capability to elevate data quality from both standard low-fidelity inputs and sparsely sampled low-fidelity inputs.
Experimental results have shown that our approach can effectively reconstruct high-quality outcomes for two-dimensional turbulent flows without requiring retraining.
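The summary suggests a diffusion process defined on the residual between low- and high-fidelity fields; a generic training objective along those lines (not necessarily PiRD's exact formulation) could look like the sketch below, where `model` and the noise schedule are placeholders.

```python
# Hypothetical sketch of a residual-diffusion training step: the diffusion
# model operates on the residual between the high-fidelity field and an
# upsampled low-fidelity input. Names and shapes are illustrative only.
import torch
import torch.nn.functional as F

def residual_diffusion_loss(model, x_hi, x_lo, alpha_bar, num_steps=1000):
    """x_hi: (B, C, H, W) high-fidelity field; x_lo: low-fidelity field."""
    x_up = F.interpolate(x_lo, size=x_hi.shape[-2:], mode="bilinear",
                         align_corners=False)
    residual = x_hi - x_up                        # target of the diffusion
    t = torch.randint(0, num_steps, (x_hi.shape[0],), device=x_hi.device)
    eps = torch.randn_like(residual)
    ab = alpha_bar[t].view(-1, 1, 1, 1)
    noisy = ab.sqrt() * residual + (1 - ab).sqrt() * eps
    return F.mse_loss(model(noisy, t, cond=x_up), eps)
```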
arXiv Detail & Related papers (2024-04-12T11:45:51Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
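As a toy illustration of such a self-consuming loop with kernel density estimation, each generation below is fit on a mix of real data and samples from the previous model; the setup is hypothetical and only meant to show the mechanism.

```python
# Toy self-consuming KDE loop: fit, sample, mix with real data, refit, and
# watch how the learned distribution drifts across generations.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=2000)         # ground-truth distribution
model = gaussian_kde(real)

for gen in range(5):
    synthetic = model.resample(2000, seed=gen)[0]
    n_real = 1000                               # half real, half synthetic
    data = np.concatenate([real[:n_real], synthetic[: 2000 - n_real]])
    model = gaussian_kde(data)
    # Track drift of the learned distribution's spread over generations.
    std = model.resample(5000, seed=99)[0].std()
    print(f"generation {gen}: sample std = {std:.3f}")
```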
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Continual Learning of Diffusion Models with Generative Distillation [34.52513912701778]
Diffusion models are powerful generative models that achieve state-of-the-art performance in image synthesis.
In this paper, we propose generative distillation, an approach that distils the entire reverse process of a diffusion model.
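A plausible reading of distilling the entire reverse process is to roll out the teacher's reverse chain and train the student to match the teacher's noise predictions along it; the sketch below illustrates that under assumed DDPM-style interfaces, not the paper's exact code.

```python
# Hypothetical generative-distillation sketch: record the teacher's reverse
# trajectory, then fit the student to the teacher's predictions along it.
import torch
import torch.nn.functional as F

@torch.no_grad()
def teacher_trajectory(teacher, x_shape, alpha, alpha_bar, beta, device="cpu"):
    """Run the teacher's reverse process and record (x_t, t) pairs."""
    x_t = torch.randn(x_shape, device=device)
    traj = []
    for t in reversed(range(len(beta))):
        traj.append((x_t, t))
        eps = teacher(x_t, torch.full((x_shape[0],), t, device=device))
        mean = (x_t - beta[t] / (1 - alpha_bar[t]).sqrt() * eps) / alpha[t].sqrt()
        noise = torch.randn_like(x_t) if t > 0 else 0.0
        x_t = mean + beta[t].sqrt() * noise
    return traj

def distillation_loss(student, teacher, traj, device="cpu"):
    """Match the student's noise predictions to the teacher's along the chain."""
    loss = 0.0
    for x_t, t in traj:
        tt = torch.full((x_t.shape[0],), t, device=device)
        with torch.no_grad():
            target = teacher(x_t, tt)
        loss = loss + F.mse_loss(student(x_t, tt), target)
    return loss / len(traj)
```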
arXiv Detail & Related papers (2023-11-23T14:33:03Z)
- On the Stability of Iterative Retraining of Generative Models on their own Data [56.153542044045224]
We study the impact of training generative models on mixed datasets.
We first prove the stability of iterative training under the condition that the initial generative models approximate the data distribution well enough.
We empirically validate our theory on both synthetic and natural images by iteratively training normalizing flows and state-of-the-art diffusion models.
arXiv Detail & Related papers (2023-09-30T16:41:04Z)
- A Denoising Diffusion Model for Fluid Field Prediction [0.0]
We propose FluidDiff, a novel denoising diffusion generative model for predicting nonlinear fluid fields.
By performing a diffusion process, the model is able to learn a complex representation of the high-dimensional dynamic system.
Langevin sampling is used to generate predictions for the flow state under specified initial conditions.
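Langevin sampling itself is standard: starting from noise, iterate x ← x + (η/2)·score(x) + √η·z. A minimal generic loop, with `score_fn` standing in for the trained network:

```python
# Unadjusted Langevin dynamics: a generic illustration of the sampling
# scheme named in the summary, with `score_fn` as a placeholder model.
import torch

def langevin_sample(score_fn, x_init, step_size=1e-4, n_steps=500):
    x = x_init.clone()
    for _ in range(n_steps):
        noise = torch.randn_like(x)
        x = x + 0.5 * step_size * score_fn(x) + (step_size ** 0.5) * noise
    return x
```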
arXiv Detail & Related papers (2023-01-27T11:30:40Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Churn Reduction via Distillation [54.5952282395487]
We show an equivalence between training with distillation using the base model as the teacher and training with an explicit constraint on the predictive churn.
We then show that distillation performs strongly for low-churn training against a number of recent baselines.
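One standard way to train with an explicit anchor to the base model, consistent with this equivalence, is a loss that mixes cross-entropy with a KL term toward the base model's predictions; the sketch below is illustrative, not the paper's exact objective.

```python
# Hypothetical churn-reduction objective: cross-entropy on labels plus a KL
# term that anchors the new model to the deployed base model's predictions.
import torch
import torch.nn.functional as F

def low_churn_loss(student_logits, base_logits, labels, lam=0.5, temp=1.0):
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(
        F.log_softmax(student_logits / temp, dim=-1),
        F.log_softmax(base_logits / temp, dim=-1),
        log_target=True, reduction="batchmean",
    )
    return (1 - lam) * ce + lam * kl   # lam trades accuracy against churn
```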
arXiv Detail & Related papers (2021-06-04T18:03:31Z)
- Contrastive Model Inversion for Data-Free Knowledge Distillation [60.08025054715192]
We propose Contrastive Model Inversion, where the data diversity is explicitly modeled as an optimizable objective.
Our main observation is that, under the constraint of the same amount of data, higher data diversity usually indicates stronger instance discrimination.
Experiments on CIFAR-10, CIFAR-100, and Tiny-ImageNet demonstrate that CMI achieves significantly superior performance when the generated data are used for knowledge distillation.
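A simplified rendering of that objective treats diversity as a contrastive penalty on the pairwise embedding similarity of the synthetic batch; the following sketch is hypothetical (`embed`, the loss weights, and the temperature are assumptions, not the paper's implementation).

```python
# Hypothetical contrastive model-inversion step: synthetic inputs are
# optimized so the teacher is confident on them (inversion term) while their
# embeddings stay mutually dissimilar (diversity term).
import torch
import torch.nn.functional as F

def cmi_step(x_syn, teacher, embed, targets, opt, tau=0.07, w_div=1.0):
    # x_syn is a leaf tensor with requires_grad=True, optimized directly,
    # e.g. opt = torch.optim.Adam([x_syn]).
    opt.zero_grad()
    inv_loss = F.cross_entropy(teacher(x_syn), targets)   # fit the teacher
    z = F.normalize(embed(x_syn), dim=-1)                 # (N, d) embeddings
    sim = z @ z.t() / tau                                 # pairwise similarity
    mask = ~torch.eye(len(z), dtype=torch.bool, device=z.device)
    div_loss = sim[mask].exp().mean().log()               # penalize similarity
    loss = inv_loss + w_div * div_loss
    loss.backward()
    opt.step()
    return loss.item()
```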
arXiv Detail & Related papers (2021-05-18T15:13:00Z)
- Hybrid modeling: Applications in real-time diagnosis [64.5040763067757]
We outline a novel hybrid modeling approach that combines machine-learning-inspired models and physics-based models.
We are using such models for real-time diagnosis applications.
arXiv Detail & Related papers (2020-03-04T00:44:57Z)