Physics-Embedded Neural Networks: Graph Neural PDE Solvers with Mixed
Boundary Conditions
- URL: http://arxiv.org/abs/2205.11912v2
- Date: Thu, 23 Mar 2023 17:24:04 GMT
- Title: Physics-Embedded Neural Networks: Graph Neural PDE Solvers with Mixed
Boundary Conditions
- Authors: Masanobu Horie and Naoto Mitsume
- Abstract summary: Graph neural networks (GNNs) are a promising approach to learning and predicting physical phenomena.
We present a physics-embedded GNN that considers boundary conditions and predicts the state after a long time.
Our model can be a useful standard for realizing reliable, fast, and accurate GNN-based PDE solvers.
- Score: 3.04585143845864
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) are a promising approach to learning and predicting
physical phenomena described in boundary value problems, such as partial
differential equations (PDEs) with boundary conditions. However, existing
models inadequately treat boundary conditions essential for the reliable
prediction of such problems. In addition, because of the locally connected
nature of GNNs, it is difficult to accurately predict the state after a long
time, where the interaction between vertices tends to be global. We present our
approach, termed physics-embedded neural networks, which considers boundary
conditions and predicts the state after a long time using an implicit method.
It is built on an E(n)-equivariant GNN, resulting in high generalization
performance on various shapes. We demonstrate that our model learns flow
phenomena in complex shapes and outperforms a well-optimized classical solver
and a state-of-the-art machine learning model in the speed-accuracy trade-off.
Therefore, our model can be a useful standard for realizing reliable, fast, and
accurate GNN-based PDE solvers. The code is available at
https://github.com/yellowshippo/penn-neurips2022.
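As a rough illustration of the approach described in the abstract (an implicit time step combined with explicit treatment of boundary conditions), here is a minimal sketch of what one backward-Euler-style update could look like. It is not the authors' implementation; `gnn_rhs`, `dirichlet_mask`, `dirichlet_values`, and `neumann_flux` are hypothetical placeholders.

```python
# Minimal sketch (not the paper's code): one implicit time step of a
# GNN-based PDE solver with mixed boundary conditions.
# All names below are hypothetical placeholders for illustration only.
import torch

def implicit_step(u, gnn_rhs, dt, dirichlet_mask, dirichlet_values,
                  neumann_flux=None, n_fixed_point_iters=20):
    """Backward-Euler update u_next = u + dt * f(u_next), solved by
    fixed-point iteration, with Dirichlet values re-imposed each sweep."""
    u_next = u.clone()
    for _ in range(n_fixed_point_iters):
        rhs = gnn_rhs(u_next)                  # learned spatial operator
        if neumann_flux is not None:
            rhs = rhs + neumann_flux           # prescribed boundary flux term
        u_next = u + dt * rhs
        # Hard-enforce Dirichlet conditions so they hold exactly.
        u_next = torch.where(dirichlet_mask, dirichlet_values, u_next)
    return u_next
```

Because each fixed-point sweep propagates information through the learned operator again, an implicit step of this kind can couple vertices that are far apart, which is consistent with the abstract's point about long-time prediction and global interaction.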
Related papers
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, the Physics-encoded Message Passing Graph Network (PhyMPGN).
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
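As a purely illustrative sketch of the idea of placing a GNN inside a numerical integrator (not the PhyMPGN architecture), a learned right-hand side can be wrapped in a classical second-order Runge-Kutta (Heun) step; `gnn` here is a placeholder model mapping node states and connectivity to their time derivative.

```python
# Illustrative sketch only: a learned GNN used as the right-hand side of a
# classical explicit integrator (Heun / RK2). `gnn` is a placeholder model.
import torch

def rk2_step(gnn, u, edge_index, dt):
    k1 = gnn(u, edge_index)               # estimated du/dt at the current state
    k2 = gnn(u + dt * k1, edge_index)     # estimated du/dt at the predicted state
    return u + 0.5 * dt * (k1 + k2)       # Heun's method update
```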
arXiv Detail & Related papers (2024-10-02T08:54:18Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- An Implicit GNN Solver for Poisson-like problems [2.675158177232256]
$\Psi$-GNN is a novel Graph Neural Network (GNN) approach for solving the ubiquitous Poisson PDE problems with mixed boundary conditions.
By leveraging the Implicit Layer Theory, $\Psi$-GNN models an "infinitely" deep network.
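The "infinitely deep" network mentioned above follows the generic implicit-layer recipe: the output is defined as a fixed point of a single shared layer rather than the result of stacking many layers. A minimal sketch of that recipe (not the $\Psi$-GNN architecture itself; `f` is a placeholder message-passing layer) is:

```python
# Generic implicit-layer sketch: the "infinite-depth" output is the fixed
# point z* = f(z*, x), found by iterating one shared layer to convergence.
# `f` is a hypothetical placeholder layer, not the Psi-GNN architecture.
import torch

def fixed_point_layer(f, x, tol=1e-5, max_iters=100):
    z = torch.zeros_like(x)
    for _ in range(max_iters):
        z_new = f(z, x)
        if torch.norm(z_new - z) < tol:   # stop once the iteration has converged
            return z_new
        z = z_new
    return z
```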
arXiv Detail & Related papers (2023-02-06T10:08:42Z)
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
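One way to read "predicts the spatially continuous solution given a spatial position query" is an encoder that summarizes the mesh into latent features and a decoder that is evaluated at arbitrary coordinates. The sketch below is a loose illustration under that reading, not the MAgNet architecture; `encoder` and `decoder` are placeholder networks.

```python
# Loose sketch of a coordinate-query prediction (not the MAgNet model):
# encode the mesh once, then decode the solution at arbitrary (x, y) queries.
import torch

def query_solution(encoder, decoder, node_features, edge_index, query_xy):
    z = encoder(node_features, edge_index)           # per-node latent features
    z_global = z.mean(dim=0, keepdim=True)           # crude global summary
    z_q = z_global.expand(query_xy.shape[0], -1)     # one latent per query point
    return decoder(torch.cat([query_xy, z_q], dim=-1))  # continuous prediction
```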
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
- Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore GNNs to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by the difficulty finite-depth GNNs have in capturing long-range dependencies, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- GrADE: A graph based data-driven solver for time-dependent nonlinear partial differential equations [0.0]
We propose a novel framework referred to as the Graph Attention Differential Equation (GrADE) for solving time-dependent nonlinear PDEs.
The proposed approach couples a feed-forward neural network (FNN), a graph neural network, and the recently developed Neural ODE framework.
Results obtained illustrate the capability of the proposed framework in modeling PDE and its scalability to larger domains without the need for retraining.
arXiv Detail & Related papers (2021-08-24T10:49:03Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Physics-Informed Neural Network Method for Solving One-Dimensional Advection Equation Using PyTorch [0.0]
The PINN approach allows training neural networks while respecting the PDE as a strong constraint in the optimization.
In standard small-scale circulation simulations, it is shown that the conventional approach incorporates a pseudo-diffusive effect that is almost as large as the effect of the turbulent diffusion model.
Of all the schemes tested, only the PINNs approximation accurately predicted the outcome.
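To make the "PDE as a strong constraint in the optimization" concrete, a minimal PINN-style residual for the 1D advection equation u_t + c * u_x = 0 can be written in PyTorch as below; this is an illustrative sketch, not the paper's implementation, and `net` and `c` are placeholders.

```python
# Minimal PINN-style residual for the 1D advection equation u_t + c*u_x = 0
# (illustrative sketch in PyTorch, not the paper's code).
import torch

def advection_residual(net, x, t, c=1.0):
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=-1))
    u_x = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                              create_graph=True)[0]
    u_t = torch.autograd.grad(u, t, grad_outputs=torch.ones_like(u),
                              create_graph=True)[0]
    return u_t + c * u_x   # the PDE loss drives this residual toward zero
```

The squared mean of this residual at collocation points, added to data and boundary terms, would form the training loss in the usual PINN setup.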
arXiv Detail & Related papers (2021-03-15T05:39:17Z)
- ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations [86.41674945012369]
We develop a scalable and expressive Graph Neural Network model, ForceNet, to approximate atomic forces.
Our proposed ForceNet is able to predict atomic forces more accurately than state-of-the-art physics-based GNNs.
arXiv Detail & Related papers (2021-03-02T03:09:06Z)