Neural Operator with Regularity Structure for Modeling Dynamics Driven
by SPDEs
- URL: http://arxiv.org/abs/2204.06255v2
- Date: Thu, 14 Apr 2022 06:39:06 GMT
- Title: Neural Operator with Regularity Structure for Modeling Dynamics Driven
by SPDEs
- Authors: Peiyan Hu, Qi Meng, Bingguang Chen, Shiqi Gong, Yue Wang, Wei Chen,
Rongchan Zhu, Zhi-Ming Ma, Tie-Yan Liu
- Abstract summary: Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates model feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2d stochastic Navier-Stokes equation.
- Score: 70.51212431290611
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic partial differential equations (SPDEs) are significant tools for
modeling dynamics in many areas, including atmospheric sciences and physics.
Neural Operators, generalizations of neural networks capable of learning maps
between infinite-dimensional function spaces, are strong tools for solving
parametric PDEs. However, they lack the ability to model SPDEs, whose solutions
usually have poor regularity due to the driving noise. As the theory of
regularity structures has achieved great success in analyzing SPDEs and
provides the concept of model feature vectors that approximate SPDEs' solutions
well, we propose the Neural Operator with Regularity Structure (NORS), which
incorporates these feature vectors for modeling dynamics driven by SPDEs. We
conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and
the 2d stochastic Navier-Stokes equation, and the results demonstrate that NORS
is resolution-invariant, efficient, and achieves one order of magnitude lower
error with a modest amount of data.
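The abstract describes the NORS architecture only at a high level. Below is a minimal, hypothetical PyTorch sketch of that idea, assuming the regularity-structure feature vectors are approximated by iterated spectral convolutions of the driving noise and then concatenated with the initial condition before an FNO-style operator; `model_features`, `SpectralConv2d`, and `NORSSketch` are illustrative names, not the authors' code.

```python
# Hypothetical sketch only: feature construction + FNO-style operator.
import torch
import torch.nn as nn


def model_features(noise, kernel_fft, depth=3):
    """Stand-in for regularity-structure feature vectors: iterated spectral
    convolutions of the driving noise with a fixed (e.g. heat) kernel."""
    feats, cur = [], noise                          # noise: (batch, H, W)
    for _ in range(depth):
        cur = torch.fft.ifft2(torch.fft.fft2(cur) * kernel_fft).real
        feats.append(cur)
    return torch.stack(feats, dim=1)                # (batch, depth, H, W)


class SpectralConv2d(nn.Module):
    """Standard Fourier layer: learnable weights on the lowest `modes` frequencies."""
    def __init__(self, in_ch, out_ch, modes):
        super().__init__()
        self.modes = modes
        self.w = nn.Parameter(
            torch.randn(in_ch, out_ch, modes, modes, dtype=torch.cfloat) / (in_ch * out_ch))

    def forward(self, x):                           # x: (batch, in_ch, H, W)
        x_ft = torch.fft.rfft2(x)
        out = torch.zeros(x.size(0), self.w.size(1), x.size(-2), x.size(-1) // 2 + 1,
                          dtype=torch.cfloat, device=x.device)
        m = self.modes
        out[:, :, :m, :m] = torch.einsum("bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.w)
        return torch.fft.irfft2(out, s=x.shape[-2:])


class NORSSketch(nn.Module):
    """Toy operator: lift [u0, features] -> two Fourier layers -> project."""
    def __init__(self, n_feats, width=32, modes=12):
        super().__init__()
        self.lift = nn.Conv2d(1 + n_feats, width, 1)
        self.spec1 = SpectralConv2d(width, width, modes)
        self.spec2 = SpectralConv2d(width, width, modes)
        self.proj = nn.Conv2d(width, 1, 1)

    def forward(self, u0, feats):                   # u0: (batch, H, W)
        x = self.lift(torch.cat([u0.unsqueeze(1), feats], dim=1))
        x = torch.relu(self.spec1(x))
        x = torch.relu(self.spec2(x))
        return self.proj(x).squeeze(1)              # predicted solution field
```

Because the learnable weights act only on a fixed number of Fourier modes, the sketch can be evaluated on inputs of different grid sizes, which mirrors the resolution-invariance property highlighted in the abstract.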
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN)
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system (a toy sketch of this GNN-inside-an-integrator idea appears after this list).
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z)
- Neural Delay Differential Equations: System Reconstruction and Image Classification [14.59919398960571]
We propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs)
Compared to NODEs, NDDEs have a stronger capacity of nonlinear representations.
We achieve lower loss and higher accuracy not only for synthetically produced data but also for CIFAR-10, a well-known image dataset.
arXiv Detail & Related papers (2023-04-11T16:09:28Z)
- Learning PDE Solution Operator for Continuous Modeling of Time-Series [1.39661494747879]
This work presents a partial differential equation (PDE) based framework which improves the dynamics modeling capability.
We propose a neural operator that can handle time continuously without requiring iterative operations or specific grids of temporal discretization.
Our framework opens up a new way for a continuous representation of neural networks that can be readily adopted for real-world applications.
arXiv Detail & Related papers (2023-02-02T03:47:52Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM consistently achieves state-of-the-art results and yields an average relative gain of 11.5% across seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
arXiv Detail & Related papers (2023-01-24T08:39:20Z)
- Neural Generalized Ordinary Differential Equations with Layer-varying Parameters [1.3691539554014036]
We show that the layer-varying Neural-GODE is more flexible and general than the standard Neural-ODE.
The Neural-GODE enjoys the computational and memory benefits while performing comparably to ResNets in prediction accuracy.
arXiv Detail & Related papers (2022-09-21T20:02:28Z)
- Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- LordNet: An Efficient Neural Network for Learning to Solve Parametric Partial Differential Equations without Simulated Data [47.49194807524502]
We propose LordNet, a tunable and efficient neural network for modeling entanglements.
Experiments on solving Poisson's equation and the (2D and 3D) Navier-Stokes equations demonstrate that long-range entanglements can be well modeled by LordNet.
arXiv Detail & Related papers (2022-06-19T14:41:08Z)
- Neural Stochastic Partial Differential Equations [1.2183405753834562]
We introduce the Neural SPDE model providing an extension to two important classes of physics-inspired neural architectures.
On the one hand, it extends the popular neural differential equation models (ordinary, controlled, rough) in that it is capable of processing incoming information.
On the other hand, it extends Neural Operators -- recent generalizations of neural networks modelling mappings between functional spaces -- in that it can be used to learn complex SPDE solution operators.
arXiv Detail & Related papers (2021-10-19T20:35:37Z)
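As a postscript to the PhyMPGN entry above: its blurb's central idea, a learned message-passing network used as the right-hand side of an explicit time integrator, can be illustrated with a small hypothetical PyTorch sketch. `MessagePassingRHS`, the Heun step, and all sizes below are assumptions for illustration, not the authors' architecture.

```python
# Hypothetical toy: a learned message-passing RHS inside an explicit integrator.
import torch
import torch.nn as nn


class MessagePassingRHS(nn.Module):
    """du/dt ~ sum over mesh edges of MLP([u_i, u_j - u_i])."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * dim, hidden), nn.Tanh(),
                                      nn.Linear(hidden, dim))

    def forward(self, u, edges):                 # u: (N, dim), edges: (2, E)
        src, dst = edges
        msg = self.edge_mlp(torch.cat([u[dst], u[src] - u[dst]], dim=-1))
        return torch.zeros_like(u).index_add_(0, dst, msg)  # aggregate per node


def heun_step(rhs, u, edges, dt):
    """One explicit second-order (Heun) step with the learned RHS."""
    k1 = rhs(u, edges)
    k2 = rhs(u + dt * k1, edges)
    return u + 0.5 * dt * (k1 + k2)


# Usage: roll the state forward on a toy two-node mesh.
rhs = MessagePassingRHS(dim=1)
u = torch.randn(2, 1)
edges = torch.tensor([[0, 1], [1, 0]])           # one bidirectional edge
for _ in range(10):
    u = heun_step(rhs, u, edges, dt=0.01)
```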