Graph Neural Networks for Airfoil Design
- URL: http://arxiv.org/abs/2305.05469v1
- Date: Tue, 9 May 2023 14:15:55 GMT
- Title: Graph Neural Networks for Airfoil Design
- Authors: Florent Bonnet
- Abstract summary: We propose an adaptation of a known architecture to tackle the task of approximating the solution of the two-dimensional steady-state Navier-Stokes equations over different airfoils.
This work takes place in a longer project that aims to approximate three-dimensional steady-state solutions over industrial geometries.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The study of partial differential equations (PDE) through the framework of
deep learning emerged a few years ago, leading to impressive approximations of
simple dynamics. Graph neural networks (GNN) turned out to be very useful in
those tasks by allowing the treatment of the unstructured data often encountered
in the numerical resolution of PDE. However, solving harder PDE such as the
Navier-Stokes equations remains a challenging task, and most of the work done on
the latter concentrates either on simulating the flow around simple geometries or
on qualitative results that merely look physical for design purposes. In this
study, we try to leverage the work done on deep learning for PDE and GNN by
proposing an adaptation of a known architecture in order to tackle the task of
approximating the solution of the two-dimensional steady-state incompressible
Navier-Stokes equations over different airfoil geometries. In addition, we test
our model not only on its performance over the volume but also on its ability to
approximate surface quantities such as the wall shear stress or the isostatic
pressure, from which global coefficients such as the lift and the drag of the
airfoil can be inferred in order to allow design exploration. This work takes
place in a longer project that aims to approximate three-dimensional steady-state
solutions over industrial geometries.
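As a concrete illustration of the last point, the sketch below shows how global coefficients can be obtained once a model predicts surface fields. It is a minimal, hypothetical example (not the paper's code), assuming the predicted static pressure and wall shear stress are available on a discretized 2D airfoil surface with known panel normals and lengths; all names, shapes, and defaults are illustrative.

```python
# Minimal sketch (not the paper's implementation): integrating predicted
# surface pressure and wall shear stress into lift and drag coefficients.
# All names, shapes, and defaults are illustrative assumptions.
import numpy as np

def aerodynamic_coefficients(p, tau_w, normals, lengths,
                             u_inf, rho=1.0, chord=1.0, alpha=0.0):
    """p: (N,) pressure per surface panel; tau_w: (N, 2) wall shear stress;
    normals: (N, 2) outward unit normals; lengths: (N,) panel lengths;
    u_inf: freestream speed; alpha: angle of attack in radians."""
    # Force on each panel: pressure acts along -n, friction acts along tau_w.
    df = (-p[:, None] * normals + tau_w) * lengths[:, None]
    force = df.sum(axis=0)  # total aerodynamic force per unit span

    # Drag is the component along the freestream, lift is perpendicular to it.
    drag_dir = np.array([np.cos(alpha), np.sin(alpha)])
    lift_dir = np.array([-np.sin(alpha), np.cos(alpha)])
    q_ref = 0.5 * rho * u_inf**2 * chord  # dynamic pressure times chord
    return force @ lift_dir / q_ref, force @ drag_dir / q_ref  # (C_l, C_d)
```

Since errors in the predicted surface fields propagate directly into these integrals, evaluating the surface quantities separately from the volume solution, as the paper does, is what makes the inferred lift and drag usable for design exploration.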
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain across the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Physics-Informed Graph Convolutional Networks: Towards a generalized framework for complex geometries [0.0]
We justify the use of graph neural networks for solving partial differential equations.
An alternative procedure is proposed by combining classical numerical solvers with the Physics-Informed framework.
We propose an implementation of this approach, which we test on a three-dimensional problem on an irregular geometry.
arXiv Detail & Related papers (2023-10-20T09:46:12Z) - Efficient Neural PDE-Solvers using Quantization Aware Training [71.0934372968972]
We show that quantization can successfully lower the computational cost of inference while maintaining performance.
Our results on four standard PDE datasets and three network architectures show that quantization-aware training works across settings and three orders of magnitude in FLOPs.
arXiv Detail & Related papers (2023-08-14T09:21:19Z) - INFINITY: Neural Field Modeling for Reynolds-Averaged Navier-Stokes Equations [13.242926257057084]
INFINITY is a deep learning model that encodes geometric information and physical fields into compact representations.
Our framework achieves state-of-the-art performance by accurately inferring physical fields throughout the volume and surface.
Our model can correctly predict drag and lift coefficients while adhering to the equations.
arXiv Detail & Related papers (2023-07-25T14:35:55Z) - Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective at solving forward and inverse differential equation problems.
However, PINNs are prone to training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs in order to improve the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z) - Physics-constrained Unsupervised Learning of Partial Differential Equations using Meshes [1.066048003460524]
Graph neural networks show promise in accurately representing irregularly meshed objects and learning their dynamics.
In this work, we represent meshes naturally as graphs, process these using Graph Networks, and formulate our physics-based loss to provide an unsupervised learning framework for partial differential equations (PDE).
Our framework will enable the application of PDE solvers in interactive settings, such as model-based control of soft-body deformations.
arXiv Detail & Related papers (2022-03-30T19:22:56Z) - Physics-informed neural networks with hard constraints for inverse design [3.8191831921441337]
We propose a new deep learning method -- physics-informed neural networks with hard constraints (hPINNs) -- for solving topology optimization.
We demonstrate the effectiveness of hPINN for a holography problem in optics and a fluid problem of Stokes flow.
arXiv Detail & Related papers (2021-02-09T03:18:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.