Learning the Solution Operator of Boundary Value Problems using Graph Neural Networks
- URL: http://arxiv.org/abs/2206.14092v2
- Date: Thu, 17 Aug 2023 07:34:37 GMT
- Title: Learning the Solution Operator of Boundary Value Problems using Graph Neural Networks
- Authors: Winfried Lötzsch, Simon Ohler, Johannes S. Otterbach
- Abstract summary: We design a general solution operator for two different time-independent PDEs using graph neural networks (GNNs) and spectral graph convolutions.
We train the networks on simulated data from a finite elements solver on a variety of shapes and inhomogeneities.
We find that training on a diverse dataset with lots of variation in the finite element meshes is a key ingredient for achieving good generalization results.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As an alternative to classical numerical solvers for partial differential
equations (PDEs) subject to boundary value constraints, there has been a surge
of interest in investigating neural networks that can solve such problems
efficiently. In this work, we design a general solution operator for two
different time-independent PDEs using graph neural networks (GNNs) and spectral
graph convolutions. We train the networks on simulated data from a finite
elements solver on a variety of shapes and inhomogeneities. In contrast to
previous works, we focus on the ability of the trained operator to generalize
to previously unseen scenarios. Specifically, we test generalization to meshes
with different shapes and superposition of solutions for a different number of
inhomogeneities. We find that training on a diverse dataset with lots of
variation in the finite element meshes is a key ingredient for achieving good
generalization results in all cases. With this, we believe that GNNs can be
used to learn solution operators that generalize over a range of properties and
produce solutions much faster than a generic solver. Our dataset, which we make
publicly available, can be used and extended to verify the robustness of these
models under varying conditions.
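The abstract's key building block, a spectral graph convolution, can be made concrete with a minimal sketch. The example below is an illustration, not the paper's actual architecture: it applies a polynomial filter in the graph Laplacian (a monomial basis rather than the Chebyshev basis often used in practice) to a node signal on a small path graph; the graph, filter order, and coefficients are all illustrative assumptions.

```python
def matvec(M, x):
    # dense matrix-vector product over nested lists
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def laplacian(A):
    # combinatorial graph Laplacian L = D - A
    n = len(A)
    deg = [sum(row) for row in A]
    return [[(deg[i] if i == j else 0) - A[i][j] for j in range(n)] for i in range(n)]

def spectral_conv(A, x, theta):
    """Apply the polynomial filter sum_k theta[k] * L^k x to the node signal x."""
    L = laplacian(A)
    out = [0.0] * len(x)
    z = x[:]                # z = L^0 x
    for t in theta:
        out = [o + t * zi for o, zi in zip(out, z)]
        z = matvec(L, z)    # next power of the Laplacian
    return out

# 4-node path graph 0-1-2-3 (adjacency matrix)
A = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
x = [1.0, 0.0, 0.0, 0.0]             # unit signal on node 0
y = spectral_conv(A, x, [0.0, 1.0])  # pure L filter: y = L x
print(y)  # → [1.0, -1.0, 0.0, 0.0]
```

In a learned operator, the filter coefficients would be trainable parameters per layer and feature channel; here they are fixed by hand to show the mechanics.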
Related papers
- Solving partial differential equations with sampled neural networks [1.8590821261905535]
Approximation of solutions to partial differential equations (PDE) is an important problem in computational science and engineering.
We discuss how sampling the hidden weights and biases of the ansatz network from data-agnostic and data-dependent probability distributions allows us to progress on both challenges.
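The sampling idea above, drawing hidden weights and biases at random and fitting only the linear output layer, can be sketched as follows. The target function, feature count, sampling distributions, and ridge parameter are illustrative assumptions, not the paper's setup.

```python
import math, random

random.seed(0)

def features(x, W, b):
    # hidden layer with sampled (not trained) weights and biases
    return [math.tanh(w * x + bi) for w, bi in zip(W, b)]

def lstsq(Phi, y, ridge=1e-6):
    """Solve min_c ||Phi c - y||^2 via the (ridge-regularized) normal equations."""
    n, m = len(Phi), len(Phi[0])
    G = [[sum(Phi[k][i] * Phi[k][j] for k in range(n)) + (ridge if i == j else 0.0)
          for j in range(m)] for i in range(m)]
    r = [sum(Phi[k][i] * y[k] for k in range(n)) for i in range(m)]
    for i in range(m):            # forward elimination (G is SPD, no pivoting needed)
        for j in range(i + 1, m):
            f = G[j][i] / G[i][i]
            for k in range(i, m):
                G[j][k] -= f * G[i][k]
            r[j] -= f * r[i]
    c = [0.0] * m
    for i in reversed(range(m)):  # back substitution
        c[i] = (r[i] - sum(G[i][k] * c[k] for k in range(i + 1, m))) / G[i][i]
    return c

# fit f(x) = sin(pi x) on [0, 1] with 12 sampled tanh features
W = [random.uniform(-6, 6) for _ in range(12)]
b = [random.uniform(-6, 6) for _ in range(12)]
xs = [i / 50 for i in range(51)]
c = lstsq([features(x, W, b) for x in xs], [math.sin(math.pi * x) for x in xs])

pred = sum(ci * fi for ci, fi in zip(c, features(0.5, W, b)))
print(abs(pred - 1.0))  # error at x = 0.5, where sin(pi/2) = 1
```

Because only the output layer is fit, training reduces to one linear solve, which is the efficiency argument behind sampled networks.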
arXiv Detail & Related papers (2024-05-31T14:24:39Z)
- Transformers as Neural Operators for Solutions of Differential Equations with Finite Regularity [1.6874375111244329]
We first establish the theoretical groundwork that transformers possess the universal approximation property as operator learning models.
In particular, we consider three examples: the Izhikevich neuron model, the tempered fractional-order Leaky Integrate-and-Fire (LIF) model, and the one-dimensional Euler equation problem.
arXiv Detail & Related papers (2024-05-29T15:10:24Z)
- Reference Neural Operators: Learning the Smooth Dependence of Solutions of PDEs on Geometric Deformations [13.208548352092455]
For partial differential equations on domains of arbitrary shapes, existing works of neural operators attempt to learn a mapping from geometries to solutions.
We propose reference neural operators (RNO) to learn the smooth dependence of solutions on geometric deformations.
RNO outperforms baseline models in accuracy by a large margin and achieves up to 80% error reduction.
arXiv Detail & Related papers (2024-05-27T06:50:17Z)
- GIT-Net: Generalized Integral Transform for Operator Learning [58.13313857603536]
This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators.
GIT-Net harnesses the fact that differential operators commonly used for defining PDEs can often be represented parsimoniously when expressed in specialized functional bases.
Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting small test errors and low evaluation costs across a range of PDE problems.
arXiv Detail & Related papers (2023-12-05T03:03:54Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
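The physics-informed loss that such benchmarks stress can be illustrated on a toy problem. In the sketch below, the "network" is just a candidate function and derivatives are taken by finite differences rather than autodiff; the ODE, collocation points, and candidates are assumptions for illustration, not the paper's benchmarks.

```python
import math

def pinn_loss(u, ts, h=1e-5):
    # residual of u'(t) + u(t) = 0 at the collocation points,
    # plus a penalty for the initial condition u(0) = 1
    res = [((u(t + h) - u(t - h)) / (2 * h) + u(t)) ** 2 for t in ts]
    return sum(res) / len(res) + (u(0.0) - 1.0) ** 2

ts = [i / 10 for i in range(1, 11)]     # collocation points in (0, 1]
exact = lambda t: math.exp(-t)          # true solution of u' = -u, u(0) = 1
wrong = lambda t: 1.0 - t               # a poor candidate
print(pinn_loss(exact, ts))  # near zero: the true solution satisfies the physics
print(pinn_loss(wrong, ts))  # large: the residual penalizes the bad candidate
```

A PINN minimizes this residual-plus-boundary loss over network parameters; the failure modes cited above (capacity, conditioning, curvature) concern how hard that minimization becomes as the system grows more complex.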
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Sparse Deep Neural Network for Nonlinear Partial Differential Equations [3.0069322256338906]
This paper is devoted to a numerical study of adaptive approximation of solutions of nonlinear partial differential equations.
We develop deep neural networks (DNNs) with a sparse regularization with multiple parameters to represent functions having certain singularities.
Numerical examples confirm that solutions generated by the proposed SDNN are sparse and accurate.
arXiv Detail & Related papers (2022-07-27T03:12:16Z)
- Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
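The claim that message-passing solvers representationally contain finite differences can be made concrete: with a hand-chosen (not learned) message and update function on a 1D chain graph, one message-passing step reproduces an explicit finite-difference step of the heat equation u_t = u_xx. The grid size, initial condition, and step parameter below are illustrative assumptions.

```python
def mp_step(u, edges, alpha):
    # message along edge (i -> j): the difference of neighbor states
    msgs = {j: 0.0 for j in range(len(u))}
    for i, j in edges:
        msgs[j] += u[i] - u[j]
    # node update: add alpha times the aggregated messages; at interior
    # nodes the aggregate equals the stencil u[i-1] - 2*u[i] + u[i+1]
    return [uj + alpha * msgs[j] for j, uj in enumerate(u)]

n = 5
edges = [(i, j) for i in range(n) for j in (i - 1, i + 1) if 0 <= j < n]
u0 = [0.0, 0.0, 1.0, 0.0, 0.0]       # a spike in the middle of the chain
u1 = mp_step(u0, edges, alpha=0.25)  # alpha plays the role of dt/dx^2
print(u1)  # → [0.0, 0.25, 0.5, 0.25, 0.0]: the spike diffuses to its neighbors
```

A learned solver replaces the hand-chosen message and update functions with neural networks, which is why the classical scheme sits inside its representational class.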
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.