Deep Surrogate for Direct Time Fluid Dynamics
- URL: http://arxiv.org/abs/2112.10296v1
- Date: Thu, 16 Dec 2021 10:08:20 GMT
- Title: Deep Surrogate for Direct Time Fluid Dynamics
- Authors: Lucas Meyer (UGA, LIG, EDF R&D, Grenoble INP, DATAMOVE), Louen
Pottier (ENS Paris Saclay, EDF R&D), Alejandro Ribes (EDF R&D), Bruno Raffin
(Grenoble INP, LIG, DATAMOVE, UGA)
- Abstract summary: Graph Neural Networks (GNN) can address the specificity of the irregular meshes commonly used in CFD simulations.
We present our ongoing work to design a novel direct time GNN architecture for irregular meshes.
- Score: 44.62475518267084
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The ubiquity of fluids in the physical world explains the need to accurately
simulate their dynamics for many scientific and engineering applications.
Traditionally, well-established but resource-intensive CFD solvers provide such
simulations. Recent years have seen a surge of deep learning surrogate models
that substitute for these solvers to reduce the cost of simulation. Some
approaches to building data-driven surrogates mimic the solver's iterative process.
They infer the next state of the fluid given its previous one. Others directly
infer the state from time input. Approaches also differ in their management of
the spatial information. Graph Neural Networks (GNN) can address the
specificity of the irregular meshes commonly used in CFD simulations. In this
article, we present our ongoing work to design a novel direct time GNN
architecture for irregular meshes. It consists of a succession of graphs of
increasing size connected by spline convolutions. We test our architecture on
the von Kármán vortex street benchmark. It achieves small
generalization errors while mitigating error accumulation along the trajectory.
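To make the architecture description concrete, the following PyTorch Geometric sketch illustrates the general idea of a direct time network: a scalar time input modulates features on a small coarse graph, which are then propagated through a succession of larger graphs connected by spline convolutions (SplineConv). It is a minimal illustration under assumptions, not the authors' implementation: the mesh hierarchy, the coarse-to-fine `up_map` indices, the layer sizes, and every name (`DirectTimeGNN`, `hierarchy`, etc.) are hypothetical, and running it requires torch_geometric with the torch-spline-conv extension.

```python
import torch
from torch_geometric.nn import SplineConv


class DirectTimeGNN(torch.nn.Module):
    def __init__(self, hidden=64, out_channels=3, n_coarse_nodes=32, n_levels=3):
        super().__init__()
        # Learned features of the coarsest graph, later modulated by the time input.
        self.coarse_embed = torch.nn.Parameter(torch.randn(n_coarse_nodes, hidden))
        self.time_mlp = torch.nn.Sequential(
            torch.nn.Linear(1, hidden), torch.nn.SiLU(), torch.nn.Linear(hidden, hidden)
        )
        # One spline convolution per level of the graph hierarchy
        # (dim=2 assumes 2-D edge pseudo-coordinates scaled to [0, 1]).
        self.convs = torch.nn.ModuleList(
            [SplineConv(hidden, hidden, dim=2, kernel_size=5) for _ in range(n_levels)]
        )
        self.decode = torch.nn.Linear(hidden, out_channels)  # e.g. velocity and pressure

    def forward(self, t, hierarchy):
        # `hierarchy`: list of dicts, coarse to fine, each with 'edge_index',
        # 'edge_attr' (pseudo-coordinates in [0, 1]^2), and 'up_map' (for every node
        # of this level, the index of its parent node on the previous, coarser level).
        x = self.coarse_embed + self.time_mlp(t.view(1, 1))   # broadcast time over nodes
        for conv, level in zip(self.convs, hierarchy):
            x = x[level["up_map"]]                            # copy parent features to finer graph
            x = conv(x, level["edge_index"], level["edge_attr"]).relu()
        return self.decode(x)                                 # field on the finest (CFD) mesh
```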
Related papers
- FMint: Bridging Human Designed and Data Pretrained Models for Differential Equation Foundation Model [5.748690310135373]
We propose a novel multi-modal foundation model, named FMint, to bridge the gap between human-designed and data-driven models.
Built on a decoder-only transformer architecture with in-context learning, FMint utilizes both numerical and textual data to learn a universal error correction scheme.
Our results demonstrate the effectiveness of the proposed model in terms of both accuracy and efficiency compared to classical numerical solvers.
arXiv Detail & Related papers (2024-04-23T02:36:47Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Graph Convolutional Networks for Simulating Multi-phase Flow and Transport in Porous Media [0.0]
Data-driven surrogate modeling provides inexpensive alternatives to high-fidelity numerical simulators.
CNNs are powerful at approximating partial differential equation solutions, but it remains challenging for them to handle irregular and unstructured simulation meshes.
We construct surrogate models based on Graph Convolutional Networks (GCNs) to approximate the spatial-temporal solutions of multi-phase flow and transport processes in porous media.
arXiv Detail & Related papers (2023-07-10T09:59:35Z)
- Temporal Subsampling Diminishes Small Spatial Scales in Recurrent Neural Network Emulators of Geophysical Turbulence [0.0]
We investigate how an often overlooked processing step affects the quality of an emulator's predictions.
We implement ML architectures from a class of methods called reservoir computing: (1) a form of spatial Vector Autoregression (NVAR), and (2) an Echo State Network (ESN); a generic ESN update is sketched after this list.
In all cases, subsampling the training data consistently leads to an increased bias at small scales that resembles numerical diffusion.
arXiv Detail & Related papers (2023-04-28T21:34:53Z)
- Machine Learning model for gas-liquid interface reconstruction in CFD numerical simulations [59.84561168501493]
The volume of fluid (VoF) method is widely used in multi-phase flow simulations to track and locate the interface between two immiscible fluids.
A major bottleneck of the VoF method is the interface reconstruction step due to its high computational cost and low accuracy on unstructured grids.
We propose a machine learning enhanced VoF method based on Graph Neural Networks (GNN) to accelerate the interface reconstruction on general unstructured meshes.
arXiv Detail & Related papers (2022-07-12T17:07:46Z)
- Simulating Liquids with Graph Networks [25.013244956897832]
We investigate graph neural networks (GNNs) for learning fluid dynamics.
Our results indicate that learning models, such as GNNs, fail to learn the exact underlying dynamics unless the training set is devoid of any other problem-specific correlations.
arXiv Detail & Related papers (2022-03-14T15:39:27Z)
- A Gradient-based Deep Neural Network Model for Simulating Multiphase Flow in Porous Media [1.5791732557395552]
We describe a gradient-based deep neural network (GDNN) constrained by the physics related to multiphase flow in porous media.
We demonstrate that GDNN can effectively predict the nonlinear patterns of subsurface responses.
arXiv Detail & Related papers (2021-04-30T02:14:00Z)
- Machine learning for rapid discovery of laminar flow channel wall modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z)
- Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
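The last entry, Liquid Time-constant Networks, describes recurrent models built from linear first-order dynamical systems whose effective time constant depends on the input. The cell below is an illustrative reading of that idea, not code from the paper: it integrates dx/dt = -x/tau - f(x, u) * x + f(x, u) * A with a fused semi-implicit Euler step (one common choice that keeps the state bounded); the gate f, the parameter A, all sizes, and the name `LTCCell` are assumptions.

```python
import torch


class LTCCell(torch.nn.Module):
    def __init__(self, input_size, hidden_size, tau=1.0):
        super().__init__()
        self.gate = torch.nn.Linear(input_size + hidden_size, hidden_size)
        self.A = torch.nn.Parameter(torch.zeros(hidden_size))    # asymptotic target state
        self.tau = tau                                            # base time constant

    def forward(self, u, x, dt=0.1):
        f = torch.sigmoid(self.gate(torch.cat([u, x], dim=-1)))  # bounded, input-dependent gate
        # Fused semi-implicit Euler step: the state stays bounded for any dt > 0.
        return (x + dt * f * self.A) / (1.0 + dt * (1.0 / self.tau + f))


# Usage: unroll the cell over an input sequence of shape (time, batch, features).
cell = LTCCell(input_size=4, hidden_size=16)
x = torch.zeros(1, 16)
for u_t in torch.randn(20, 1, 4):
    x = cell(u_t, x)
```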
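For the reservoir-computing entry above ("Temporal Subsampling Diminishes Small Spatial Scales ..."), which names an Echo State Network (ESN) as one of its emulators, here is a generic ESN sketch: a fixed random reservoir with a leaky-integrator update and a linear readout trained by ridge regression. The reservoir size, spectral radius, leak rate, ridge coefficient, and the toy next-step prediction target are assumptions for illustration and are not taken from that paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 8, 200                 # input and reservoir sizes (illustrative)
leak, rho, ridge = 0.3, 0.9, 1e-6    # leak rate, spectral radius, ridge coefficient

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # rescale reservoir to spectral radius rho

def step(x, u):
    """Leaky-integrator reservoir update; only the linear readout below is trained."""
    return (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)

# Drive the reservoir with a training sequence U (T x n_in) and fit a linear
# readout to targets Y (T x n_in) by ridge regression.
U = rng.standard_normal((500, n_in))
Y = np.roll(U, -1, axis=0)                        # toy target: predict the next input
X = np.zeros((len(U), n_res))
x = np.zeros(n_res)
for t, u in enumerate(U):
    x = step(x, u)
    X[t] = x
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T
```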