SetPINNs: Set-based Physics-informed Neural Networks
- URL: http://arxiv.org/abs/2409.20206v1
- Date: Mon, 30 Sep 2024 11:41:58 GMT
- Title: SetPINNs: Set-based Physics-informed Neural Networks
- Authors: Mayank Nagda, Phil Ostheimer, Thomas Specht, Frank Rhein, Fabian Jirasek, Marius Kloft, Sophie Fellenz
- Abstract summary: Physics-Informed Neural Networks (PINNs) have emerged as a promising method for approximating solutions to partial differential equations (PDEs) using deep learning.
We propose SetPINNs, a novel approach inspired by Finite Element Methods from the field of Numerical Analysis.
We show that SetPINNs demonstrate superior generalization performance and accuracy across diverse physical systems.
- Score: 19.27512418720287
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Physics-Informed Neural Networks (PINNs) have emerged as a promising method for approximating solutions to partial differential equations (PDEs) using deep learning. However, PINNs, based on multilayer perceptrons (MLPs), often employ point-wise predictions, overlooking implicit dependencies within the physical system, such as temporal or spatial dependencies. These dependencies can be captured by more complex network architectures, for example CNNs or Transformers. However, such architectures conventionally do not allow for incorporating physical constraints, as advancements in integrating such constraints within these frameworks are still lacking. Relying on point-wise predictions often results in trivial solutions. To address this limitation, we propose SetPINNs, a novel approach inspired by Finite Element Methods from the field of Numerical Analysis. SetPINNs incorporate the dependencies inherent in the physical system while still allowing physical constraints to be enforced. They accurately approximate the PDE solution over a region, thereby modeling the inherent dependencies between multiple neighboring points in that region. Our experiments show that SetPINNs achieve superior generalization performance and accuracy across diverse physical systems, mitigating failure modes and converging faster than existing approaches. Furthermore, we demonstrate the utility of SetPINNs on two real-world physical systems.
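To make the set-based idea more concrete, below is a minimal, hypothetical PyTorch sketch of the mechanism the abstract describes: a set of neighbouring collocation points (analogous to a finite element) is mapped to solution values jointly, and a PDE residual is enforced on every point of the set. The model `SetElementNet`, the helper `residual_loss`, the set size, and the toy Burgers-type residual are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch only: the architecture, set size, and the toy
# Burgers-type residual below are illustrative assumptions, not the
# authors' SetPINN implementation.

class SetElementNet(nn.Module):
    """Maps a set of neighbouring (x, t) points to solution values jointly."""
    def __init__(self, hidden: int = 64, heads: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(2, hidden), nn.Tanh())
        self.mix = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.decoder = nn.Linear(hidden, 1)

    def forward(self, xt: torch.Tensor) -> torch.Tensor:
        # xt: (batch, set_size, 2) -> u: (batch, set_size, 1)
        h = self.encoder(xt)
        h, _ = self.mix(h, h, h)  # points within a set attend to each other
        return self.decoder(h)

def residual_loss(model: nn.Module, xt: torch.Tensor, nu: float = 0.01) -> torch.Tensor:
    """Burgers-type residual u_t + u*u_x - nu*u_xx, enforced on every set point."""
    xt = xt.clone().requires_grad_(True)
    u = model(xt)
    grads = torch.autograd.grad(u.sum(), xt, create_graph=True)[0]
    u_x, u_t = grads[..., 0:1], grads[..., 1:2]
    u_xx = torch.autograd.grad(u_x.sum(), xt, create_graph=True)[0][..., 0:1]
    return ((u_t + u * u_x - nu * u_xx) ** 2).mean()

# Usage: sample "elements" of neighbouring collocation points and minimise the
# physics residual (boundary/data losses would be added in the same way).
model = SetElementNet()
xt_sets = torch.rand(32, 8, 2)  # 32 elements, each a set of 8 (x, t) points
loss = residual_loss(model, xt_sets)
loss.backward()
```

The only change relative to a point-wise MLP PINN is that predictions and residuals are computed over whole sets of points at once, which is how local spatial and temporal dependencies would enter the model.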
Related papers
- RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z)
- Multifidelity domain decomposition-based physics-informed neural networks and operators for time-dependent problems [40.46280139210502]
A combination of multifidelity stacking PINNs and domain decomposition-based finite basis PINNs is employed.
It can be observed that the domain decomposition approach clearly improves the PINN and stacking PINN approaches.
It is demonstrated that the FBPINN approach can be extended to multifidelity physics-informed deep operator networks.
arXiv Detail & Related papers (2024-01-15T18:32:53Z)
- PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs).
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address this limitation.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z)
- On the Generalization of PINNs outside the training domain and the Hyperparameters influencing it [1.3927943269211593]
PINNs are Neural Network architectures trained to emulate solutions of differential equations without the necessity of solution data.
We perform an empirical analysis of the behavior of PINN predictions outside their training domain.
We assess whether the algorithmic setup of PINNs can influence their potential for generalization and showcase the respective effect on the prediction.
arXiv Detail & Related papers (2023-02-15T09:51:56Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
There has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves deep into the intrinsic structures of SNNs, by elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z)
- Robust Learning of Physics Informed Neural Networks [2.86989372262348]
Physics-informed Neural Networks (PINNs) have been shown to be effective in solving partial differential equations.
This paper shows that a PINN can be sensitive to errors in the training data and can overfit to them, dynamically propagating these errors over the solution domain of the PDE.
arXiv Detail & Related papers (2021-10-26T00:10:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.