Applications of physics informed neural operators
- URL: http://arxiv.org/abs/2203.12634v1
- Date: Wed, 23 Mar 2022 18:00:05 GMT
- Title: Applications of physics informed neural operators
- Authors: Shawn G. Rosofsky, E. A. Huerta
- Abstract summary: We present an end-to-end framework to learn partial differential equations.
We first demonstrate that our methods reproduce the accuracy and performance of other neural operators.
We apply our physics-informed neural operators to learn new types of equations, including the 2D Burgers equation.
- Score: 2.588973722689844
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present an end-to-end framework to learn partial differential equations
that brings together initial data production, selection of boundary conditions,
and the use of physics-informed neural operators to solve partial differential
equations that are ubiquitous in the study and modeling of physics phenomena.
We first demonstrate that our methods reproduce the accuracy and performance of
other neural operators published elsewhere in the literature to learn the 1D
wave equation and the 1D Burgers equation. Thereafter, we apply our
physics-informed neural operators to learn new types of equations, including
the 2D Burgers equation in the scalar, inviscid and vector types. Finally, we
show that our approach is also applicable to learn the physics of the 2D linear
and nonlinear shallow water equations, which involve three coupled partial
differential equations. We release our artificial intelligence surrogates and
scientific software to produce initial data and boundary conditions to study a
broad range of physically motivated scenarios. We provide the source code, an
interactive website to visualize the predictions of our physics informed neural
operators, and a tutorial for their use at the Data and Learning Hub for
Science.
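The abstract describes operators trained to respect the PDEs they learn, including the Burgers equation. As a minimal illustrative sketch (not the paper's implementation), the quantity such physics-informed training penalizes is the PDE residual of the predicted field; for the viscous 1D Burgers equation u_t + u u_x = ν u_xx it can be estimated with finite differences:

```python
import numpy as np

def burgers_residual(u, dx, dt, nu=0.01):
    """Residual u_t + u*u_x - nu*u_xx for a solution field u[t, x],
    estimated with central differences on the interior grid points.
    A physics-informed loss drives this residual toward zero."""
    u_t = (u[2:, 1:-1] - u[:-2, 1:-1]) / (2 * dt)
    u_x = (u[1:-1, 2:] - u[1:-1, :-2]) / (2 * dx)
    u_xx = (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2]) / dx**2
    return u_t + u[1:-1, 1:-1] * u_x - nu * u_xx

# A constant field solves Burgers exactly, so its residual vanishes.
u = np.ones((8, 16))
print(np.abs(burgers_residual(u, dx=0.1, dt=0.01)).max())  # 0.0
```

In an actual physics-informed neural operator, `u` would be the operator's prediction and the mean squared residual would enter the training loss alongside the data and boundary terms.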
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Physics-informed nonlinear vector autoregressive models for the prediction of dynamical systems [0.36248657646376703]
We focus on one class of models called nonlinear vector autoregression (NVAR) to solve ordinary differential equations (ODEs).
Motivated by connections to numerical integration and physics-informed neural networks, we explicitly derive the physics-informed NVAR (piNVAR).
Because NVAR and piNVAR completely share their learned parameters, we propose an augmented procedure to jointly train the two models.
We evaluate the ability of the piNVAR model to predict solutions to various ODE systems, such as the undamped spring, a Lotka-Volterra predator-prey nonlinear model, and the chaotic Lorenz system.
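The core of an NVAR model, as the summary describes, is a feature vector built from time-delayed states and their polynomial products, fed to a linear readout. A minimal sketch of that idea (illustrative only; the paper's exact feature set and training procedure may differ):

```python
import numpy as np

def nvar_features(x, k=2):
    """Delay embedding of a scalar series plus quadratic cross terms:
    [1, x_t, x_{t-1}, ..., x_t^2, x_t*x_{t-1}, ...] for each time t."""
    rows = []
    for t in range(k - 1, len(x) - 1):
        lin = x[t - k + 1 : t + 1]
        quad = [a * b for i, a in enumerate(lin) for b in lin[i:]]
        rows.append(np.concatenate(([1.0], lin, quad)))
    return np.array(rows)

# Fit a one-step-ahead linear readout W so that x_{t+1} ~ W @ features_t.
x = np.sin(np.linspace(0, 10, 200))
Phi = nvar_features(x, k=2)
y = x[2:]  # targets aligned with the feature rows
W, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ W
print(np.abs(pred - y).max() < 1e-6)  # True: a sinusoid obeys a linear recurrence
```

The physics-informed variant (piNVAR) additionally constrains this readout with the governing ODE, which is why the two models can share parameters.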
arXiv Detail & Related papers (2024-07-25T14:10:42Z) - Latent Intuitive Physics: Learning to Transfer Hidden Physics from A 3D Video [58.043569985784806]
We introduce latent intuitive physics, a transfer learning framework for physics simulation.
It can infer hidden properties of fluids from a single 3D video and simulate the observed fluid in novel scenes.
We validate our model in three ways: (i) novel scene simulation with the learned visual-world physics, (ii) future prediction of the observed fluid dynamics, and (iii) supervised particle simulation.
arXiv Detail & Related papers (2024-06-18T16:37:44Z) - PICL: Physics Informed Contrastive Learning for Partial Differential Equations [7.136205674624813]
We develop a novel contrastive pretraining framework that improves neural operator generalization across multiple governing equations simultaneously.
A combination of physics-informed system evolution and latent-space model output are anchored to input data and used in our distance function.
We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
arXiv Detail & Related papers (2024-01-29T17:32:22Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
An AI framework, known as Neural Operators, presents a principled framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Learning differential equations from data [0.0]
In recent times, due to the abundance of data, there is an active search for data-driven methods to learn differential equation models from data.
We propose a forward-Euler based neural network model and test its performance by learning ODEs from data using different numbers of hidden layers and different network widths.
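A forward-Euler based neural model, as summarized above, embeds a learned right-hand side f into the explicit Euler update. A minimal sketch of that update rule (with a known f standing in for the trained network, which is an assumption for illustration):

```python
import numpy as np

def euler_rollout(f, x0, h, steps):
    """Forward-Euler integration x_{n+1} = x_n + h * f(x_n).
    In the learned setting, f would be a neural network fit to data."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + h * f(xs[-1]))
    return np.array(xs)

# With f(x) = -x the scheme approximates exponential decay e^{-t}.
traj = euler_rollout(lambda x: -x, x0=1.0, h=0.01, steps=100)
print(abs(traj[-1] - np.exp(-1.0)) < 0.01)  # True
```

Training such a model amounts to choosing the network parameters of f so that one-step Euler predictions match the observed trajectory samples.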
arXiv Detail & Related papers (2022-05-23T17:36:28Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Climate Modeling with Neural Diffusion Equations [3.8521112392276]
We design a novel climate model based on the neural ordinary differential equation (NODE) and the diffusion equation.
Our method consistently outperforms existing baselines by non-trivial margins.
arXiv Detail & Related papers (2021-11-11T01:48:46Z) - Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
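The Fourier Neural Operator paper above parameterizes the integral kernel directly in Fourier space. The core spectral convolution can be sketched in a few lines (a simplified 1D, single-channel version; the real FNO stacks several such layers with learned complex weights and pointwise nonlinearities):

```python
import numpy as np

def spectral_conv_1d(u, weights, modes):
    """Core FNO operation on a 1D signal: FFT, multiply the lowest
    `modes` Fourier coefficients by (learned) complex weights,
    truncate the rest, and inverse FFT back to physical space."""
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights[:modes]
    return np.fft.irfft(out_hat, n=len(u))

# Identity weights on all retained modes reproduce a band-limited signal.
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(3 * x)                      # only Fourier mode 3 is nonzero
w = np.ones(8, dtype=complex)          # keep the first 8 modes unchanged
print(np.allclose(spectral_conv_1d(u, w, modes=8), u))  # True
```

Because the weights act on a fixed number of Fourier modes rather than on grid points, the same learned layer can be evaluated on different discretizations, which is one source of the speedups the abstract reports.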
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.