Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable
domain decomposition approach for solving differential equations
- URL: http://arxiv.org/abs/2107.07871v1
- Date: Fri, 16 Jul 2021 13:03:47 GMT
- Title: Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable
domain decomposition approach for solving differential equations
- Authors: Ben Moseley, Andrew Markham, Tarje Nissen-Meyer
- Abstract summary: We propose a new, scalable approach for solving large problems relating to differential equations, called Finite Basis PINNs (FBPINNs).
FBPINNs are inspired by classical finite element methods, where the solution of the differential equation is expressed as the sum of a finite set of basis functions with compact support.
In FBPINNs, neural networks are used to learn these basis functions, which are defined over small, overlapping subdomains.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, physics-informed neural networks (PINNs) have offered a powerful
new paradigm for solving problems relating to differential equations. Compared
to classical numerical methods, PINNs have several advantages, for example their
ability to provide mesh-free solutions of differential equations and their
ability to carry out forward and inverse modelling within the same optimisation
problem. Whilst promising, a key limitation to date is that PINNs have
struggled to accurately and efficiently solve problems with large domains
and/or multi-scale solutions, which is crucial for their real-world
application. Multiple significant and related factors contribute to this issue,
including the increasing complexity of the underlying PINN optimisation problem
as the problem size grows and the spectral bias of neural networks. In this
work we propose a new, scalable approach for solving large problems relating to
differential equations called Finite Basis PINNs (FBPINNs). FBPINNs are
inspired by classical finite element methods, where the solution of the
differential equation is expressed as the sum of a finite set of basis
functions with compact support. In FBPINNs neural networks are used to learn
these basis functions, which are defined over small, overlapping subdomains.
FBPINNs are designed to address the spectral bias of neural networks by using
separate input normalisation over each subdomain, and reduce the complexity of
the underlying optimisation problem by using many smaller neural networks in a
parallel divide-and-conquer approach. Our numerical experiments show that
FBPINNs are effective in solving both small and larger, multi-scale problems,
outperforming standard PINNs in both accuracy and computational resources
required, potentially paving the way to the application of PINNs on large,
real-world problems.
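The FBPINN ansatz described above can be sketched in a few lines of NumPy. This is a minimal illustration only, not the authors' implementation: the cosine window shape, the tiny stand-in network, and all function and parameter names are assumptions. It shows the key idea of the abstract: the global solution is a sum of compactly supported, window-weighted subdomain networks, each receiving its own normalised input.

```python
import numpy as np

def window(x, lo, hi):
    """Smooth bump with compact support on the overlapping subdomain [lo, hi].

    Equals 0 at the subdomain edges (and everywhere outside it) and 1 at the
    centre, so each network only contributes locally. (Cosine shape assumed.)
    """
    t = np.clip((x - lo) / (hi - lo), 0.0, 1.0)
    return (np.cos(np.pi * (2.0 * t - 1.0)) + 1.0) / 2.0

def tiny_net(params, x_norm):
    """Stand-in for a small subdomain neural network: one hidden tanh layer."""
    w1, b1, w2, b2 = params               # w1, b1, w2: (h,); b2: scalar
    hidden = np.tanh(np.outer(x_norm, w1) + b1)  # (n, h)
    return hidden @ w2 + b2               # (n,)

def fbpinn_solution(x, subdomains, all_params):
    """u(x) = sum_j window_j(x) * NN_j(normalise_j(x))."""
    u = np.zeros_like(x)
    for (lo, hi), params in zip(subdomains, all_params):
        # Separate input normalisation over each subdomain, mapping to [-1, 1],
        # which is how FBPINNs address the spectral bias of neural networks.
        x_norm = 2.0 * (x - lo) / (hi - lo) - 1.0
        u += window(x, lo, hi) * tiny_net(params, x_norm)
    return u

# Two overlapping subdomains covering [0, 1], with random (untrained) weights.
rng = np.random.default_rng(0)
subdomains = [(0.0, 0.6), (0.4, 1.0)]
all_params = [(rng.normal(size=8), rng.normal(size=8), rng.normal(size=8), 0.0)
              for _ in subdomains]
x = np.linspace(0.0, 1.0, 50)
u = fbpinn_solution(x, subdomains, all_params)
```

In an actual FBPINN the subdomain networks are trained jointly against the PDE residual of this summed ansatz; the sketch shows only the solution representation, not the physics-informed loss or the parallel divide-and-conquer training.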
Related papers
- Binary structured physics-informed neural networks for solving equations
with rapidly changing solutions [3.6415476576196055]
Physics-informed neural networks (PINNs) have emerged as a promising approach for solving partial differential equations (PDEs).
We propose a binary structured physics-informed neural network (BsPINN) framework, which employs binary structured neural network (BsNN) as the neural network component.
BsPINNs exhibit superior convergence speed and heightened accuracy compared to PINNs.
arXiv Detail & Related papers (2024-01-23T14:37:51Z)
- Multifidelity domain decomposition-based physics-informed neural networks and operators for time-dependent problems [40.46280139210502]
A combination of multifidelity stacking PINNs and domain decomposition-based finite basis PINNs is employed.
It can be observed that the domain decomposition approach clearly improves the PINN and stacking PINN approaches.
It is demonstrated that the FBPINN approach can be extended to multifidelity physics-informed deep operator networks.
arXiv Detail & Related papers (2024-01-15T18:32:53Z)
- PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs).
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address this limitation.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- $\Delta$-PINNs: physics-informed neural networks on complex geometries [2.1485350418225244]
Physics-informed neural networks (PINNs) have demonstrated promise in solving forward and inverse problems involving partial differential equations.
To date, there is no clear way to inform PINNs about the topology of the domain where the problem is being solved.
We propose a novel positional encoding mechanism for PINNs based on the eigenfunctions of the Laplace-Beltrami operator.
arXiv Detail & Related papers (2022-09-08T18:03:19Z)
- Semi-analytic PINN methods for singularly perturbed boundary value problems [0.8594140167290099]
We propose a new semi-analytic physics informed neural network (PINN) to solve singularly perturbed boundary value problems.
The PINN is a scientific machine learning framework that offers a promising perspective for finding numerical solutions to partial differential equations.
arXiv Detail & Related papers (2022-08-19T04:26:40Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences.