Neural Operators for Accelerating Scientific Simulations and Design
- URL: http://arxiv.org/abs/2309.15325v5
- Date: Thu, 4 Jan 2024 20:38:03 GMT
- Title: Neural Operators for Accelerating Scientific Simulations and Design
- Authors: Kamyar Azizzadenesheli, Nikola Kovachki, Zongyi Li, Miguel
Liu-Schiaffini, Jean Kossaifi, Anima Anandkumar
- Abstract summary: An AI framework, known as Neural Operators, presents a principled framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
- Score: 85.89660065887956
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scientific discovery and engineering design are currently limited by the time
and cost of physical experiments, selected mostly through trial-and-error and
intuition that require deep domain expertise. Numerical simulations present an
alternative to physical experiments but are usually infeasible for complex
real-world domains due to the computational requirements of existing numerical
methods. Artificial intelligence (AI) presents a potential paradigm shift by
developing fast data-driven surrogate models. In particular, an AI framework,
known as Neural Operators, presents a principled framework for learning
mappings between functions defined on continuous domains, e.g., spatiotemporal
processes and partial differential equations (PDEs). They can extrapolate and
predict solutions at new locations unseen during training, i.e., perform
zero-shot super-resolution. Neural Operators can augment or even replace
existing simulators in many applications, such as computational fluid dynamics,
weather forecasting, and material modeling, while being 4-5 orders of magnitude
faster. Further, Neural Operators can be integrated with physics and other
domain constraints enforced at finer resolutions to obtain high-fidelity
solutions and good generalization. Since Neural Operators are differentiable,
they can directly optimize parameters for inverse design and other inverse
problems. We believe that Neural Operators present a transformative approach to
simulation and design, enabling rapid research and development.
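Two of the claims above, zero-shot super-resolution and end-to-end differentiability, are concrete enough to illustrate in code. Below is a minimal single Fourier layer in PyTorch, the building block of a Fourier Neural Operator (FNO); the class name, sizes, and initialization are assumptions for the sketch, not the authors' reference implementation. Because the learned weights act on a fixed number of Fourier modes rather than on grid points, the same parameters can be evaluated on a grid finer than any seen during training.

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Illustrative Fourier layer: learned weights act on the lowest
    Fourier modes, so the layer is independent of the grid resolution."""

    def __init__(self, channels: int, n_modes: int):
        super().__init__()
        self.n_modes = n_modes
        scale = 1.0 / channels
        self.weight = torch.nn.Parameter(
            scale * torch.randn(channels, channels, n_modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, n_grid), sampled on a uniform grid
        x_hat = torch.fft.rfft(x)              # (batch, channels, n_grid//2 + 1)
        out_hat = torch.zeros_like(x_hat)
        k = min(self.n_modes, x_hat.shape[-1])
        # Keep the lowest k modes and mix channels with the learned weights.
        out_hat[..., :k] = torch.einsum(
            "bik,iok->bok", x_hat[..., :k], self.weight[..., :k]
        )
        return torch.fft.irfft(out_hat, n=x.shape[-1])

layer = SpectralConv1d(channels=1, n_modes=8)
coarse = torch.randn(1, 1, 64)   # resolution used during training
fine = torch.randn(1, 1, 256)    # 4x finer grid, never seen in training
print(layer(coarse).shape, layer(fine).shape)  # same weights, both resolutions
```

In a full FNO, several such layers are interleaved with pointwise linear maps and nonlinearities, but the resolution argument is the same: nothing in the parameterization refers to the grid.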
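Differentiability is what turns such a surrogate into an inverse-design tool: gradients of an objective defined on the output field flow back to the input function, so design reduces to ordinary gradient descent through the frozen model. A hedged sketch, with `surrogate`, `target`, and the design parameterization all stand-ins rather than the paper's setup:

```python
import torch

surrogate = torch.nn.Sequential(   # stands in for a trained neural operator
    torch.nn.Linear(64, 128), torch.nn.Tanh(), torch.nn.Linear(128, 64)
)
for p in surrogate.parameters():
    p.requires_grad_(False)        # freeze the model; only the design is optimized

target = torch.zeros(64)                       # desired output field
design = torch.randn(64, requires_grad=True)   # input function to optimize
opt = torch.optim.Adam([design], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    loss = torch.mean((surrogate(design) - target) ** 2)
    loss.backward()                # gradient w.r.t. the design, not the weights
    opt.step()
```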
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose an extension of Latent Dynamics Networks (LDNets) to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
arXiv Detail & Related papers (2024-08-19T09:14:25Z)
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories rather than as next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics by formulating the dynamics as a function over time and learning neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z)
- A foundational neural operator that continuously learns without forgetting [1.0878040851638]
We introduce the concept of the Neural Combinatorial Wavelet Neural Operator (NCWNO) as a foundational model for scientific computing.
The NCWNO is specifically designed to excel in learning from a diverse spectrum of physics and to continuously adapt to the solution operators associated with parametric partial differential equations (PDEs).
The proposed foundational model offers two key advantages: (i) it can simultaneously learn solution operators for multiple parametric PDEs, and (ii) it can swiftly generalize to new parametric PDEs with minimal fine-tuning.
arXiv Detail & Related papers (2023-10-29T03:20:10Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations; a toy sketch of the spatial decomposition follows this entry.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
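A toy sketch of the spatial decomposition described in the NeuralStagger entry above: a fine 2D field is split into s*s interleaved coarse fields that can be handled as cheaper subtasks in parallel and then re-interleaved. The function names and interleaving pattern are my reading of the abstract, not the paper's code.

```python
import torch

def stagger(u: torch.Tensor, s: int) -> list[torch.Tensor]:
    # u: (batch, channels, H, W) with H and W divisible by s
    return [u[..., i::s, j::s] for i in range(s) for j in range(s)]

def unstagger(parts: list[torch.Tensor], s: int) -> torch.Tensor:
    b, c, h, w = parts[0].shape
    u = parts[0].new_zeros(b, c, h * s, w * s)
    for idx, part in enumerate(parts):
        i, j = divmod(idx, s)
        u[..., i::s, j::s] = part
    return u

u = torch.randn(2, 1, 64, 64)
parts = stagger(u, s=2)                       # four 32x32 coarse subproblems
assert torch.equal(unstagger(parts, s=2), u)  # lossless reassembly
```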
- Generalized Neural Closure Models with Interpretability [28.269731698116257]
We develop a novel and versatile methodology of unified neural partial delay differential equations.
We augment existing/low-fidelity dynamical models directly in their partial differential equation (PDE) forms with both Markovian and non-Markovian neural network (NN) closure parameterizations.
We demonstrate the new generalized neural closure models (gnCMs) framework using four sets of experiments based on advecting nonlinear waves, shocks, and ocean acidification models; a minimal sketch of the closure idea appears after this entry.
arXiv Detail & Related papers (2023-01-15T21:57:43Z)
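A minimal sketch of the closure idea, under my reading of the summary above: keep a known low-fidelity right-hand side and let a small network supply the missing physics. The stand-in dynamics and the Markovian (memoryless) closure below are assumptions; the paper also covers non-Markovian closures and delay terms.

```python
import torch

closure = torch.nn.Sequential(   # learned closure term (assumed Markovian here)
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def rhs(u: torch.Tensor) -> torch.Tensor:
    f_low = -u   # stand-in low-fidelity model, not from the paper
    return f_low + closure(u.unsqueeze(-1)).squeeze(-1)

# One explicit Euler step; training would backpropagate through such steps.
u = torch.randn(128)
u_next = u + 1e-2 * rhs(u)
```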
- On Fast Simulation of Dynamical System with Neural Vector Enhanced Numerical Solver [59.13397937903832]
We introduce a deep learning-based corrector called Neural Vector (NeurVec).
NeurVec can compensate for integration errors and enable larger time step sizes in simulations.
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability; a toy sketch of the corrected integration step appears below.
arXiv Detail & Related papers (2022-08-07T09:02:18Z)
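A toy sketch of the corrector idea from the NeurVec entry above: a cheap explicit step plus a learned term that compensates its integration error, so a larger time step stays accurate. The dynamics, the network, and the exact placement of the corrector are assumptions for illustration.

```python
import torch

corrector = torch.nn.Sequential(   # learned error-compensation term
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 2)
)

def f(u: torch.Tensor) -> torch.Tensor:
    # stand-in dynamics: a linear oscillator du/dt = (u2, -u1)
    return torch.stack([u[..., 1], -u[..., 0]], dim=-1)

def step(u: torch.Tensor, dt: float) -> torch.Tensor:
    # forward Euler plus the learned compensation for its truncation error
    return u + dt * f(u) + dt * corrector(u)

u = torch.tensor([1.0, 0.0])
for _ in range(10):
    u = step(u, dt=0.1)   # in spirit, a dt larger than plain Euler tolerates
```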
- Seismic wave propagation and inversion with Neural Operators [7.296366040398878]
We develop a prototype framework for learning general solutions using a recently developed machine learning paradigm called Neural Operator.
A trained Neural Operator can compute a solution in negligible time for any velocity structure or source location.
We illustrate the method with the 2D acoustic wave equation and demonstrate the method's applicability to seismic tomography.
arXiv Detail & Related papers (2021-08-11T19:17:39Z)
- Long-time integration of parametric evolution equations with physics-informed DeepONets [0.0]
We introduce an effective framework for learning infinite-dimensional operators that map random initial conditions to associated PDE solutions within a short time interval.
Global long-time predictions across a range of initial conditions can be then obtained by iteratively evaluating the trained model.
This introduces a new approach to temporal domain decomposition that is shown to be effective in performing accurate long-time simulations; a short rollout sketch follows this entry.
arXiv Detail & Related papers (2021-06-09T20:46:17Z)
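The iterative evaluation described in the entry above is a simple composition loop; a minimal sketch with `short_time_model` as a placeholder for the trained physics-informed DeepONet:

```python
import torch

short_time_model = torch.nn.Linear(64, 64)   # stands in for the trained operator

def long_time_solution(u0: torch.Tensor, n_intervals: int) -> torch.Tensor:
    # Compose the short-horizon operator with itself to reach long horizons.
    states, u = [u0], u0
    for _ in range(n_intervals):
        u = short_time_model(u)   # advance by one short time interval
        states.append(u)
    return torch.stack(states)    # the full long-time trajectory

trajectory = long_time_solution(torch.randn(64), n_intervals=100)
print(trajectory.shape)           # (101, 64)
```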
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions, as well as to state-of-the-art numerical solvers such as spectral solvers; a minimal PINN residual example follows.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
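For context, a minimal PINN-style residual in PyTorch (not the GatedPINN architecture from the paper above): the network u(t, x) is trained so that the PDE residual, here for the 1D heat equation u_t = nu * u_xx, vanishes at sampled collocation points.

```python
import torch

net = torch.nn.Sequential(   # u(t, x) approximated by a small MLP
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
nu = 0.1   # illustrative diffusion coefficient

def pde_residual(t: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    t = t.clone().requires_grad_(True)
    x = x.clone().requires_grad_(True)
    u = net(torch.stack([t, x], dim=-1)).squeeze(-1)

    def grad(out, inp):
        return torch.autograd.grad(out.sum(), inp, create_graph=True)[0]

    u_t = grad(u, t)
    u_xx = grad(grad(u, x), x)
    return u_t - nu * u_xx   # the training loss drives this toward zero

t, x = torch.rand(256), torch.rand(256)
loss = pde_residual(t, x).pow(2).mean()   # plus initial/boundary-condition terms
loss.backward()
```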