Method of data forward generation with partial differential equations for machine learning modeling in fluid mechanics
- URL: http://arxiv.org/abs/2501.03300v1
- Date: Mon, 06 Jan 2025 15:17:13 GMT
- Title: Method of data forward generation with partial differential equations for machine learning modeling in fluid mechanics
- Authors: Ruilin Chen, Xiaowei Jin, Nikolaus A. Adams, Hui Li
- Abstract summary: This study proposes a highly efficient data forward generation method from partial differential equations (PDEs).
A Poisson neural network (Poisson-NN) embedded in the projection method and a wavelet transform convolutional neural network (WTCNN) embedded in multigrid numerical simulation are proposed, respectively, for solving the incompressible Navier-Stokes equations.
- Score: 1.9688252014450927
- License:
- Abstract: Artificial intelligence (AI) for fluid mechanics has become an attractive topic. High-fidelity data is one of the most critical prerequisites for the successful application of AI in fluid mechanics; however, such data is expensive to obtain or even inaccessible. This study proposes a highly efficient data forward generation method from partial differential equations (PDEs). Specifically, solutions of the PDEs are first generated, either following a random field (e.g. a Gaussian random field, GRF, with computational complexity O(N log N), where N is the number of spatial points) or physical laws (e.g. a prescribed spectrum, with computational complexity O(NM), where M is the number of modes); the source terms, boundary conditions and initial conditions are then computed so that the PDEs are satisfied. In this way, data pairs of source terms, boundary conditions and initial conditions together with the corresponding PDE solutions can be constructed. A Poisson neural network (Poisson-NN) embedded in the projection method and a wavelet transform convolutional neural network (WTCNN) embedded in multigrid numerical simulation are proposed, respectively, for solving the incompressible Navier-Stokes equations. The feasibility of the generated data for training Poisson-NN and WTCNN is validated. The results indicate that, even without any DNS data, the generated data can train these two models with excellent generalization and accuracy. Data generated following physical laws yield significantly better convergence rate, generalization and accuracy than data generated following a GRF.
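As a concrete illustration of this forward-generation idea, the sketch below samples a GRF as the solution of a 2-D periodic Poisson problem and then derives the matching source term spectrally, producing (source, solution) training pairs without ever running a PDE solver. The choice of equation, grid, spectrum and parameters is an illustrative assumption, not the authors' implementation.

```python
# Minimal sketch (assumptions: 2-D periodic Poisson problem -lap(u) = f,
# squared-exponential spectrum; this is NOT the authors' code) of forward
# data generation: sample the *solution* u as a Gaussian random field via
# FFT (O(N log N)) and derive the consistent source term f analytically.
import numpy as np

def grf_poisson_pair(n=128, length_scale=0.1, seed=None):
    rng = np.random.default_rng(seed)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=1.0 / n)     # angular wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    spectrum = np.exp(-0.5 * length_scale**2 * k2)     # smooth GRF spectrum
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    u = np.fft.ifft2(np.sqrt(spectrum) * noise).real   # sampled "solution"
    u -= u.mean()                                      # zero mean on the torus
    f = np.fft.ifft2(k2 * np.fft.fft2(u)).real         # f = -lap(u), spectrally
    return f, u                                        # one (input, label) pair

# A dataset of such pairs could train a Poisson surrogate (e.g. a network used
# inside the projection step) without any DNS data.
pairs = [grf_poisson_pair(seed=i) for i in range(8)]
```

The same recipe extends, in principle, to time-dependent problems: sample the solution fields first, then compute whatever source terms, boundary conditions and initial conditions make the governing equations hold.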
Related papers
- Discovery of Quasi-Integrable Equations from traveling-wave data using the Physics-Informed Neural Networks [0.0]
PINNs are used to study vortex solutions in (2+1)-dimensional nonlinear partial differential equations.
We consider PINNs with conservation laws (referred to as cPINNs), deformations of the initial profiles, and a friction approach to improve the resolution of the identification.
arXiv Detail & Related papers (2024-10-23T08:29:13Z) - Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling.
This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles (a minimal sketch of this probabilistic representation appears after the list below).
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Stacked Generative Machine Learning Models for Fast Approximations of Steady-State Navier-Stokes Equations [1.4150517264592128]
We develop a weakly-supervised approach to solve the steady-state Navier-Stokes equations under various boundary conditions.
We achieve state-of-the-art results without any labeled simulation data.
We train stacked models of increasing complexity to generate numerical solutions of the N-S equations.
arXiv Detail & Related papers (2021-12-13T05:08:55Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Physics-Informed Neural Network Method for Solving One-Dimensional Advection Equation Using PyTorch [0.0]
The PINN approach allows training neural networks while respecting the PDEs as a strong constraint in the optimization (a minimal PINN-loss sketch for the 1-D advection equation appears after the list below).
In standard small-scale circulation simulations, it is shown that the conventional approach introduces a pseudo-diffusive effect that is almost as large as the effect of the turbulent diffusion model.
Of all the schemes tested, only the PINNs approximation accurately predicted the outcome.
arXiv Detail & Related papers (2021-03-15T05:39:17Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Physics informed deep learning for computational elastodynamics without labeled data [13.084113582897965]
We present a physics-informed neural network (PINN) with mixed-variable output to model elastodynamics problems without resort to labeled data.
Results show the promise of PINN in the context of computational mechanics applications.
arXiv Detail & Related papers (2020-06-10T19:05:08Z)
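For the Monte Carlo Neural PDE Solver entry above, the following sketch illustrates the kind of probabilistic (Feynman-Kac) representation it refers to, using a constant-coefficient 1-D convection-diffusion problem. The equation, parameters and initial condition are assumptions for illustration, not the paper's algorithm.

```python
# Feynman-Kac sketch for u_t + c u_x = nu u_xx on the real line:
# u(x, t) = E[u0(x - c t + sqrt(2 nu t) Z)], Z ~ N(0, 1), i.e. the solution
# is an ensemble average over random particle displacements.
import numpy as np

def mc_convection_diffusion(u0, x, t, c=1.0, nu=0.01, n_particles=10_000, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_particles, 1))            # Brownian increments
    endpoints = x[None, :] - c * t + np.sqrt(2.0 * nu * t) * z
    return u0(endpoints).mean(axis=0)                     # particle average

u0 = lambda s: np.sin(2.0 * np.pi * s)                    # assumed initial data
x = np.linspace(0.0, 1.0, 101)
u_half = mc_convection_diffusion(u0, x, t=0.5)            # pointwise estimates
```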
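Similarly, for the PINN advection entry, the sketch below shows in PyTorch how the 1-D advection equation u_t + c u_x = 0 can act as a training constraint via automatic differentiation. The network size, speed c, initial profile and collocation sampling are assumptions, and boundary terms are omitted, so this is a sketch of the PINN idea rather than the paper's implementation.

```python
# Minimal PINN sketch for u_t + c u_x = 0 (sketch only; not the paper's code).
import math
import torch

c = 1.0
net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def pde_residual(x, t):
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    return u_t + c * u_x                              # residual of the PDE

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    x, t = torch.rand(256, 1), torch.rand(256, 1)     # collocation points
    x0 = torch.rand(256, 1)                           # initial-condition points
    u0 = torch.sin(2.0 * math.pi * x0)                # assumed initial profile
    loss = pde_residual(x, t).pow(2).mean() + (
        net(torch.cat([x0, torch.zeros_like(x0)], dim=1)) - u0
    ).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```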
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.