$\Delta$-PINNs: physics-informed neural networks on complex geometries
- URL: http://arxiv.org/abs/2209.03984v1
- Date: Thu, 8 Sep 2022 18:03:19 GMT
- Title: $\Delta$-PINNs: physics-informed neural networks on complex geometries
- Authors: Francisco Sahli Costabal, Simone Pezzuto, Paris Perdikaris
- Abstract summary: Physics-informed neural networks (PINNs) have demonstrated promise in solving forward and inverse problems involving partial differential equations.
To date, there is no clear way to inform PINNs about the topology of the domain where the problem is being solved.
We propose a novel positional encoding mechanism for PINNs based on the eigenfunctions of the Laplace-Beltrami operator.
- Score: 2.1485350418225244
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed neural networks (PINNs) have demonstrated promise in solving
forward and inverse problems involving partial differential equations. Despite
recent progress on expanding the class of problems that can be tackled by
PINNs, most existing use cases involve simple geometric domains. To date,
there is no clear way to inform PINNs about the topology of the domain where
the problem is being solved. In this work, we propose a novel positional
encoding mechanism for PINNs based on the eigenfunctions of the
Laplace-Beltrami operator. This technique allows us to create an input space for
the neural network that represents the geometry of a given object. We
approximate the eigenfunctions as well as the operators involved in the partial
differential equations with finite elements. We extensively test and compare
the proposed methodology against traditional PINNs in complex shapes, such as a
coil, a heat sink and a bunny, with different physics, such as the Eikonal
equation and heat transfer. We also study the sensitivity of our method to the
number of eigenfunctions used, as well as the discretization used for the
eigenfunctions and the underlying operators. Our results show excellent
agreement with the ground truth data in cases where traditional PINNs fail to
produce a meaningful solution. We envision this new technique will expand the
effectiveness of PINNs to more realistic applications.
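To make the encoding concrete, the following minimal sketch (not the authors' code) assumes the FEM stiffness matrix K and mass matrix M of the mesh have already been assembled with some finite-element library (e.g., FEniCS or scikit-fem); the eigenfunctions solve the generalized problem K v = lambda M v, and their nodal values replace the raw coordinates as the network input.
```python
import numpy as np
import scipy.sparse.linalg as spla
import torch
import torch.nn as nn

def laplace_beltrami_eigenfunctions(K, M, n_eig=32):
    """Smallest eigenpairs of the generalized problem K v = lambda M v.
    K (stiffness) and M (mass) are sparse FEM matrices assembled on the
    mesh beforehand (assumed inputs). The small negative shift keeps the
    shift-invert factorization nonsingular despite the zero eigenvalue."""
    vals, vecs = spla.eigsh(K, k=n_eig, M=M, sigma=-1e-2, which="LM")
    order = np.argsort(vals)
    return vals[order], vecs[:, order]      # vecs has shape (n_nodes, n_eig)

class DeltaPINN(nn.Module):
    """MLP whose input is the eigenfunction-based positional encoding
    of a point, instead of its x, y, z coordinates."""
    def __init__(self, n_eig=32, width=64, depth=4):
        super().__init__()
        layers, d_in = [], n_eig
        for _ in range(depth):
            layers += [nn.Linear(d_in, width), nn.Tanh()]
            d_in = width
        layers.append(nn.Linear(d_in, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, encoding):            # encoding: (batch, n_eig)
        return self.net(encoding)

# Usage sketch (K, M, and a source vector f come from the FEM assembly):
# _, phi = laplace_beltrami_eigenfunctions(K, M, n_eig=32)
# model = DeltaPINN(n_eig=32)
# u_hat = model(torch.tensor(phi, dtype=torch.float32)).squeeze(-1)
# Per the abstract, the PDE operators are also discretized with finite
# elements, so a residual such as K @ u - f is evaluated with the same
# matrices rather than with automatic differentiation in coordinates.
```
Because the encoding is built from Laplace-Beltrami eigenfunctions, points that are far apart along the geometry receive distinct inputs even when they are close in Euclidean space, which is how the network can be made aware of the domain's topology.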
Related papers
- Physics-informed PointNet: On how many irregular geometries can it solve an inverse problem simultaneously? Application to linear elasticity [58.44709568277582]
Physics-informed PointNet (PIPN) is designed to fill the gap between PINNs and fully supervised learning models.
We show that PIPN predicts the solution of desired partial differential equations over a few hundred domains simultaneously.
Specifically, we show that PIPN predicts the solution of a plane stress problem over more than 500 domains with different geometries, simultaneously.
arXiv Detail & Related papers (2023-03-22T06:49:34Z)
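How a single point-cloud network can serve many geometries at once can be sketched roughly as follows; this is a simplified PointNet-style stand-in with illustrative sizes, not the published PIPN architecture.
```python
import torch
import torch.nn as nn

class PointNetPINN(nn.Module):
    """Toy PointNet-style network: one forward pass maps a batch of point
    clouds (one per geometry) to a per-point field, so many domains are
    handled simultaneously."""
    def __init__(self, dim=2, feat=64, out=1):
        super().__init__()
        self.local = nn.Sequential(nn.Linear(dim, feat), nn.Tanh(),
                                   nn.Linear(feat, feat), nn.Tanh())
        self.head = nn.Sequential(nn.Linear(2 * feat, feat), nn.Tanh(),
                                  nn.Linear(feat, out))

    def forward(self, xyz):                     # xyz: (n_geom, n_pts, dim)
        local = self.local(xyz)                 # per-point features
        global_feat = local.max(dim=1).values   # one descriptor per geometry
        global_rep = global_feat.unsqueeze(1).expand_as(local)
        return self.head(torch.cat([local, global_rep], dim=-1))

# 500 geometries, 1024 points each, 2-D coordinates (illustrative sizes):
clouds = torch.rand(500, 1024, 2, requires_grad=True)
u = PointNetPINN()(clouds)                      # (500, 1024, 1)
# PDE residuals per geometry would then be built with torch.autograd.grad
# on u with respect to clouds, as in a standard PINN loss.
```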
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Semi-analytic PINN methods for singularly perturbed boundary value problems [0.8594140167290099]
We propose a new semi-analytic physics-informed neural network (PINN) to solve singularly perturbed boundary value problems.
PINNs are a scientific machine learning framework that offers a promising approach to finding numerical solutions of partial differential equations.
arXiv Detail & Related papers (2022-08-19T04:26:40Z)
- PhyGNNet: Solving spatiotemporal PDEs with Physics-informed Graph Neural Network [12.385926494640932]
We propose PhyGNNet for solving partial differential equations on the basis of a graph neural network.
In particular, we divide the computing area into regular grids, define the partial differential operators on the grids, and then construct the PDE loss for the network to optimize, which yields the PhyGNNet model.
arXiv Detail & Related papers (2022-08-07T13:33:34Z)
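The grid-based loss construction can be illustrated with a small finite-difference sketch; the graph-network part of PhyGNNet is omitted, and the Poisson problem, grid size, and 5-point stencil below are illustrative choices, not details from that paper.
```python
import torch

def poisson_residual_loss(u_grid, f_grid, h):
    """Discrete PDE loss on a regular grid: a 5-point finite-difference
    Laplacian stands in for the differential operator, and the loss is
    the mean squared residual of -lap(u) = f at interior nodes."""
    lap = (u_grid[2:, 1:-1] + u_grid[:-2, 1:-1] +
           u_grid[1:-1, 2:] + u_grid[1:-1, :-2] -
           4.0 * u_grid[1:-1, 1:-1]) / h ** 2
    return ((-lap - f_grid[1:-1, 1:-1]) ** 2).mean()

# Example: 64 x 64 grid on [0, 1]^2 with spacing h = 1/63.
n, h = 64, 1.0 / 63
u = torch.rand(n, n, requires_grad=True)   # placeholder for nodal model output
f = torch.ones(n, n)                       # right-hand side
loss = poisson_residual_loss(u, f, h)
loss.backward()                            # gradients flow back to whatever produced u
```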
- Enforcing Continuous Physical Symmetries in Deep Learning Network for Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
arXiv Detail & Related papers (2022-06-19T00:44:22Z)
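A generic version of embedding an invariant surface condition into the PINN loss might look like the sketch below; the generator components xi, tau, eta and the weighting are user-supplied placeholders, since the paper's exact formulation is not reproduced here.
```python
import torch

def gradients(u, x):
    """du/dx for a scalar field u evaluated at points x (columns = coordinates)."""
    return torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                               create_graph=True)[0]

def spinn_loss(model, xt, pde_residual, xi, tau, eta, weight=1.0):
    """Sketch of a symmetry-enhanced PINN loss: the usual PDE residual plus
    the invariant surface condition xi*u_x + tau*u_t - eta = 0 induced by a
    chosen Lie symmetry generator."""
    xt = xt.clone().detach().requires_grad_(True)
    u = model(xt)
    du = gradients(u, xt)                     # columns: d/dx, d/dt
    u_x, u_t = du[:, :1], du[:, 1:2]
    invariant = xi(xt, u) * u_x + tau(xt, u) * u_t - eta(xt, u)
    return (pde_residual(xt, u, u_x, u_t) ** 2).mean() \
        + weight * (invariant ** 2).mean()

# Usage sketch with placeholder callables (illustrative only):
# loss = spinn_loss(model, xt, pde_residual=my_residual,
#                   xi=lambda xt, u: torch.ones_like(u),
#                   tau=lambda xt, u: torch.zeros_like(u),
#                   eta=lambda xt, u: torch.zeros_like(u))
```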
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z)
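One way to read the gradual-expansion idea is as a consensus test over the ensemble; the sketch below is a paraphrase of that idea, not the authors' algorithm, and the tolerance is an illustrative parameter.
```python
import torch

def trusted_mask(models, pts, tol=1e-2):
    """Gradual-expansion heuristic in the spirit of the ensemble approach
    summarized above: a candidate collocation point enters the training set
    only once the ensemble members agree on its value to within `tol`."""
    with torch.no_grad():
        preds = torch.stack([m(pts) for m in models], dim=0)  # (E, N, 1)
    return preds.std(dim=0).squeeze(-1) < tol                 # (N,) bool

# new_points = candidate_pts[trusted_mask(ensemble, candidate_pts)]
```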
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable domain decomposition approach for solving differential equations [20.277873724720987]
We propose a new, scalable approach for solving large problems relating to differential equations, called Finite Basis PINNs (FBPINNs).
FBPINNs are inspired by classical finite element methods, where the solution of the differential equation is expressed as the sum of a finite set of basis functions with compact support.
In FBPINNs, neural networks are used to learn these basis functions, which are defined over small, overlapping subdomains.
arXiv Detail & Related papers (2021-07-16T13:03:47Z)
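The finite-basis construction can be sketched in one dimension: each subdomain gets its own small network, multiplied by a compactly supported window and summed. The window shape, normalization, and sizes below are illustrative choices, not those of the FBPINN paper.
```python
import torch
import torch.nn as nn

class FBPINN1D(nn.Module):
    """Sketch of the finite-basis idea in 1-D: the solution is the sum of
    several small networks, each multiplied by a smooth window with
    compact support on an overlapping subdomain."""
    def __init__(self, centers, radius, width=16):
        super().__init__()
        self.centers = torch.tensor(centers)      # subdomain midpoints
        self.radius = radius                      # half-width incl. overlap
        self.nets = nn.ModuleList(
            nn.Sequential(nn.Linear(1, width), nn.Tanh(),
                          nn.Linear(width, width), nn.Tanh(),
                          nn.Linear(width, 1))
            for _ in centers)

    def window(self, x, c):
        """Smooth bump that vanishes outside |x - c| < radius."""
        s = torch.clamp(1.0 - ((x - c) / self.radius) ** 2, min=0.0)
        return s ** 2

    def forward(self, x):                         # x: (N, 1)
        out = torch.zeros_like(x)
        for c, net in zip(self.centers, self.nets):
            out = out + self.window(x, c) * net(x - c)
        return out

# model = FBPINN1D(centers=[0.0, 0.5, 1.0], radius=0.35)
# u = model(torch.linspace(0, 1, 200).unsqueeze(-1))
```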
- On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks [0.0]
We show that physics-informed neural networks (PINNs) struggle in cases where the target functions to be approximated exhibit high-frequency or multi-scale features.
We construct novel architectures that employ multi-scale random Fourier features and justify how such coordinate embedding layers can lead to robust and accurate PINN models.
arXiv Detail & Related papers (2020-12-18T04:19:30Z)
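A generic multi-scale random Fourier feature embedding of the kind referred to above can be sketched as follows; the bandwidths and the way scales are combined are illustrative, and the paper's architectures differ in detail.
```python
import math
import torch
import torch.nn as nn

class MultiScaleFourierFeatures(nn.Module):
    """Coordinate embedding with random Fourier features at several scales:
    x -> [sin(2*pi*x B_s), cos(2*pi*x B_s)] for a few bandwidths sigma_s,
    concatenated and then fed to the PINN's MLP."""
    def __init__(self, dim=2, n_feat=64, sigmas=(1.0, 10.0, 50.0)):
        super().__init__()
        self.register_buffer(
            "B", torch.cat([torch.randn(dim, n_feat) * s for s in sigmas], dim=1))

    def forward(self, x):                          # x: (N, dim)
        proj = 2.0 * math.pi * x @ self.B          # (N, n_feat * len(sigmas))
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)

# emb = MultiScaleFourierFeatures()
# features = emb(torch.rand(128, 2))               # input to the PINN's MLP
```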
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.