Geometry-Informed Neural Operator for Large-Scale 3D PDEs
- URL: http://arxiv.org/abs/2309.00583v1
- Date: Fri, 1 Sep 2023 16:59:21 GMT
- Title: Geometry-Informed Neural Operator for Large-Scale 3D PDEs
- Authors: Zongyi Li, Nikola Borislavov Kovachki, Chris Choy, Boyi Li, Jean
Kossaifi, Shourya Prakash Otta, Mohammad Amin Nabian, Maximilian Stadler,
Christian Hundt, Kamyar Azizzadenesheli, Anima Anandkumar
- Abstract summary: We propose the geometry-informed neural operator (GINO) to learn the solution operator of large-scale partial differential equations.
We successfully trained GINO to predict the pressure on car surfaces using only five hundred data points.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the geometry-informed neural operator (GINO), a highly efficient
approach to learning the solution operator of large-scale partial differential
equations with varying geometries. GINO uses a signed distance function and
point-cloud representations of the input shape and neural operators based on
graph and Fourier architectures to learn the solution operator. The graph
neural operator handles irregular grids and transforms them into and from
regular latent grids on which Fourier neural operator can be efficiently
applied. GINO is discretization-convergent, meaning the trained model can be
applied to arbitrary discretizations of the continuous domain and converges
to the continuum operator as the discretization is refined. To empirically
validate the performance of our method on large-scale simulation, we generate
the industry-standard aerodynamics dataset of 3D vehicle geometries with
Reynolds numbers as high as five million. For this large-scale 3D fluid
simulation, computing the surface pressure with numerical methods is expensive. We
successfully trained GINO to predict the pressure on car surfaces using only
five hundred data points. The cost-accuracy experiments show a 26,000×
speed-up compared to optimized GPU-based computational fluid dynamics (CFD)
simulators on computing the drag coefficient. When tested on new combinations
of geometries and boundary conditions (inlet velocities), GINO obtains a
one-fourth reduction in error rate compared to deep neural network approaches.
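As a rough illustration of the pipeline the abstract describes (irregular point cloud with signed-distance features → graph-style kernel aggregation onto a regular latent grid → Fourier layer in latent space), here is a toy, untrained sketch. This is not the authors' implementation: the function names, the Gaussian-style kernel, and the 2D setting (the paper works in 3D) are all invented for illustration, and the symmetric GNO decoder back to the surface is omitted.

```python
import numpy as np

def gno_to_grid(points, feats, grid_res=8, radius=0.3):
    """Simplified GNO encoder: aggregate features from an irregular
    point cloud onto a regular latent grid via a local kernel."""
    axes = np.linspace(0.0, 1.0, grid_res)
    gx, gy = np.meshgrid(axes, axes, indexing="ij")
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1)      # (G, 2) grid coords
    latent = np.zeros((grid.shape[0], feats.shape[1]))
    for i, g in enumerate(grid):
        d = np.linalg.norm(points - g, axis=1)
        nbr = d < radius                                   # neighbors within radius
        if nbr.any():
            w = np.exp(-d[nbr] / radius)                   # toy kernel weights
            latent[i] = (w[:, None] * feats[nbr]).sum(0) / w.sum()
    return latent.reshape(grid_res, grid_res, -1)

def fno_layer(latent, n_modes=4):
    """Toy Fourier layer: FFT, keep only low modes, inverse FFT.
    A real FNO multiplies the kept modes by learned complex weights."""
    spec = np.fft.rfft2(latent, axes=(0, 1))
    mask = np.zeros_like(spec)
    mask[:n_modes, :n_modes] = spec[:n_modes, :n_modes]    # truncate spectrum
    return np.fft.irfft2(mask, s=latent.shape[:2], axes=(0, 1))

rng = np.random.default_rng(0)
points = rng.random((200, 2))           # irregular surface point cloud (2D toy)
feats = rng.standard_normal((200, 1))   # e.g. signed-distance values per point
latent = gno_to_grid(points, feats)     # (8, 8, 1) regular latent grid
smoothed = fno_layer(latent)            # spectral filtering in latent space
```

Because the encoder integrates over neighborhoods in continuous space rather than indexing a fixed mesh, the same trained weights can in principle be applied to any discretization of the input surface, which is the discretization-convergence property the abstract highlights.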
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Efficient and generalizable nested Fourier-DeepONet for three-dimensional geological carbon sequestration [5.77922305904338]
Surrogate modeling with data-driven machine learning has become a promising alternative to accelerate physics-based simulations.
We have developed a nested Fourier-DeepONet by combining the expressiveness of the FNO with the modularity of a deep operator network (DeepONet).
This new framework is twice as efficient as a nested FNO for training and has at least 80% lower GPU memory requirement.
arXiv Detail & Related papers (2024-09-25T02:58:45Z)
- Aero-Nef: Neural Fields for Rapid Aircraft Aerodynamics Simulations [1.1932047172700866]
This paper presents a methodology to learn surrogate models of steady state fluid dynamics simulations on meshed domains.
The proposed models can be applied directly to unstructured domains for different flow conditions.
Remarkably, the method performs inference five orders of magnitude faster than the high-fidelity solver on the RANS transonic airfoil dataset.
arXiv Detail & Related papers (2024-07-29T11:48:44Z) - Latent Neural Operator for Solving Forward and Inverse PDE Problems [5.8039987932401225]
We present the Latent Neural Operator (LNO) solving PDEs in the latent space.
Experiments show that LNO reduces GPU memory by 50%, speeds up training by a factor of 1.8, and reaches state-of-the-art accuracy on four out of six benchmarks.
arXiv Detail & Related papers (2024-06-06T10:04:53Z) - Reference Neural Operators: Learning the Smooth Dependence of Solutions of PDEs on Geometric Deformations [13.208548352092455]
For partial differential equations on domains of arbitrary shapes, existing works of neural operators attempt to learn a mapping from geometries to solutions.
We propose reference neural operators (RNO) to learn the smooth dependence of solutions on geometric deformations.
RNO outperforms baseline models in accuracy by a large margin, achieving up to 80% error reduction.
arXiv Detail & Related papers (2024-05-27T06:50:17Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - SciAI4Industry -- Solving PDEs for industry-scale problems with deep
learning [1.642765885524881]
We introduce a distributed programming API for simulating training data in parallel on the cloud without requiring users to manage the underlying HPC infrastructure.
We train large-scale neural networks for solving the 3D Navier-Stokes equation and simulating 3D CO2 flow in porous media.
For the CO2 example, we simulate a training dataset based on a commercial carbon capture and storage (CCS) project and train a neural network for CO2 flow simulation on a 3D grid with over 2 million cells; it is five orders of magnitude faster than a conventional numerical simulator and 3,200 times cheaper.
arXiv Detail & Related papers (2022-11-23T05:15:32Z) - Learning Large-scale Subsurface Simulations with a Hybrid Graph Network
Simulator [57.57321628587564]
We introduce Hybrid Graph Network Simulator (HGNS) for learning reservoir simulations of 3D subsurface fluid flows.
HGNS consists of a subsurface graph neural network (SGNN) to model the evolution of fluid flows, and a 3D-U-Net to model the evolution of pressure.
Using an industry-standard subsurface flow dataset (SPE-10) with 1.1 million cells, we demonstrate that HGNS is able to reduce the inference time up to 18 times compared to standard subsurface simulators.
arXiv Detail & Related papers (2022-06-15T17:29:57Z) - Data-Driven Shadowgraph Simulation of a 3D Object [50.591267188664666]
We replace the numerical code with a computationally cheaper projection-based surrogate model.
The model is able to approximate the electric fields at a given time without computing all preceding electric fields as required by numerical methods.
The model shows good reconstruction quality for perturbed data within a narrow range of simulation parameters and can be applied to large input data.
arXiv Detail & Related papers (2021-06-01T08:46:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.