Multi-scale Physical Representations for Approximating PDE Solutions
with Graph Neural Operators
- URL: http://arxiv.org/abs/2206.14687v1
- Date: Wed, 29 Jun 2022 14:42:03 GMT
- Title: Multi-scale Physical Representations for Approximating PDE Solutions
with Graph Neural Operators
- Authors: Léon Migus, Yuan Yin, Jocelyn Ahmed Mazari, Patrick Gallinari
- Abstract summary: We study three multi-resolution schemes with integral kernel operators approximated with \emph{Message Passing Graph Neural Networks} (MPGNNs).
To validate our study, we run extensive MPGNN experiments with well-chosen metrics on steady and unsteady PDEs.
- Score: 14.466945570499183
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Representing physical signals at different scales is among the most
challenging problems in engineering. Several multi-scale modeling tools have
been developed to describe physical systems governed by \emph{Partial
Differential Equations} (PDEs). These tools sit at the crossroads of principled
physical models and numerical schemes. Recently, data-driven models have been
introduced to speed up the approximation of PDE solutions compared to numerical
solvers. Among these recent data-driven methods, neural integral operators are
a class that learns mappings between function spaces. These functions are
discretized on graphs (meshes), which are appropriate for modeling interactions
in physical phenomena. In this work, we study three multi-resolution schemes
with integral kernel operators that can be approximated with \emph{Message
Passing Graph Neural Networks} (MPGNNs). To validate our study, we run
extensive MPGNN experiments with well-chosen metrics on steady and
unsteady PDEs.
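The integral kernel operators above are typically approximated on a mesh graph by message passing: a small neural network maps each pair of node coordinates to a kernel matrix, and the integral is estimated by averaging kernel-weighted messages over each node's neighbourhood. The snippet below is a minimal PyTorch sketch of one such layer, in the spirit of graph kernel networks; the class name KernelMPLayer, the two-layer MLP, and the GELU activation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KernelMPLayer(nn.Module):
    """One kernel-integral message-passing step on a mesh graph (illustrative sketch)."""

    def __init__(self, width: int, coord_dim: int = 2, hidden: int = 64):
        super().__init__()
        # kappa_phi: maps a pair of node coordinates to a (width x width) kernel matrix
        self.kernel_mlp = nn.Sequential(
            nn.Linear(2 * coord_dim, hidden), nn.GELU(),
            nn.Linear(hidden, width * width),
        )
        self.linear = nn.Linear(width, width)  # pointwise term W v(x_i)
        self.width = width

    def forward(self, v, coords, edge_index):
        # v: (n_nodes, width) node features; coords: (n_nodes, coord_dim)
        # edge_index: (2, n_edges) long tensor of (receiver i, sender j) pairs
        i, j = edge_index
        kappa = self.kernel_mlp(torch.cat([coords[i], coords[j]], dim=-1))
        kappa = kappa.view(-1, self.width, self.width)            # (n_edges, w, w)
        msgs = torch.bmm(kappa, v[j].unsqueeze(-1)).squeeze(-1)   # kappa(x_i, x_j) v(x_j)
        # Monte Carlo estimate of the kernel integral: mean of messages per receiver node
        agg = torch.zeros_like(v).index_add_(0, i, msgs)
        deg = torch.zeros(v.size(0), 1, device=v.device).index_add_(
            0, i, torch.ones(i.size(0), 1, device=v.device)).clamp(min=1.0)
        return F.gelu(self.linear(v) + agg / deg)
```

In practice one would build a radius graph on the mesh nodes and stack a few such layers between pointwise lift and projection maps; the multi-resolution schemes studied in the paper vary the graphs on which such layers operate.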
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, the Physics-encoded Message Passing Graph Network (PhyMPGN).
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z) - Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model repeatedly applies a discretization-independent learning of the parameter embedding.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - Coupled Multiwavelet Neural Operator Learning for Coupled Partial
Differential Equations [13.337268390844745]
We propose a \textit{coupled multiwavelet} neural operator (CMWNO) learning scheme by decoupling the coupled integral kernels.
The proposed model achieves significantly higher accuracy compared to previous learning-based solvers.
arXiv Detail & Related papers (2023-03-04T03:06:47Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - KoopmanLab: machine learning for solving complex physics equations [7.815723299913228]
We present KoopmanLab, an efficient module of the Koopman neural operator family, for learning PDEs without analytic solutions or closed forms.
Our module consists of multiple variants of the Koopman neural operator (KNO), a kind of mesh-independent neural-network-based PDE solver.
The compact variants of KNO can accurately solve PDEs with small model sizes, while the large variants of KNO are more competitive in predicting highly complicated dynamical systems.
arXiv Detail & Related papers (2023-01-03T13:58:39Z) - Neural Operator with Regularity Structure for Modeling Dynamics Driven
by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs including the dynamic $\Phi^4_1$ model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interactions at all ranges with only linear complexity (a hedged multi-level sketch follows this list).
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
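As noted in the Multipole Graph Neural Operator entry above, long-range interactions can be captured at linear cost by running message passing on a hierarchy of progressively coarser graphs and combining the results. The sketch below illustrates that multi-level idea under stated assumptions: it reuses the hypothetical KernelMPLayer sketched after the abstract, and assumes that row-normalised restriction matrices R[l] (for example, from a precomputed clustering of the mesh nodes) and per-level edge lists are given. It is not the MGNO reference implementation.

```python
import torch
import torch.nn as nn


class MultiLevelOperator(nn.Module):
    """V-cycle-style operator: restrict, message-pass per level, prolong, sum (sketch)."""

    def __init__(self, width: int, coord_dim: int, n_levels: int):
        super().__init__()
        # one message-passing layer per resolution level (KernelMPLayer: see sketch above)
        self.layers = nn.ModuleList(
            [KernelMPLayer(width, coord_dim) for _ in range(n_levels)])

    def forward(self, v, coords, edge_indices, restrictions):
        # v: (n0, width) features on the finest mesh; coords: (n0, coord_dim)
        # edge_indices[l]: edges of the level-l graph (coarser levels have fewer nodes)
        # restrictions[l]: (n_{l+1}, n_l) row-normalised matrix mapping level l -> l+1,
        #                  so len(restrictions) == n_levels - 1
        feats, crds = [v], [coords]
        for R in restrictions:                       # downward pass: restrict features/coords
            feats.append(R @ feats[-1])
            crds.append(R @ crds[-1])
        out = torch.zeros_like(v)
        for l in reversed(range(len(self.layers))):  # coarse-to-fine pass
            h = self.layers[l](feats[l], crds[l], edge_indices[l])
            for R in reversed(restrictions[:l]):     # prolong back to the finest level
                h = R.t() @ h
            out = out + h
        return out
```

Because the coarse levels contain far fewer nodes, their longer-range edge sets stay cheap to process, which is how such multi-level constructions keep the overall cost roughly linear in the number of mesh nodes.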
This list is automatically generated from the titles and abstracts of the papers on this site.