Learnable Activation Functions in Physics-Informed Neural Networks for Solving Partial Differential Equations
- URL: http://arxiv.org/abs/2411.15111v1
- Date: Fri, 22 Nov 2024 18:25:13 GMT
- Title: Learnable Activation Functions in Physics-Informed Neural Networks for Solving Partial Differential Equations
- Authors: Afrah Farea, Mustafa Serdar Celebi
- Abstract summary: We investigate the use of learnable activation functions in Physics-Informed Neural Networks (PINNs) for solving Partial Differential Equations (PDEs).
We compare the efficacy of traditional Multilayer Perceptrons (MLPs) with fixed and learnable activations against Kolmogorov-Arnold Networks (KANs).
The findings offer insights into the design of neural network architectures that balance training efficiency, convergence speed, and test accuracy for PDE solvers.
- Abstract: We investigate the use of learnable activation functions in Physics-Informed Neural Networks (PINNs) for solving Partial Differential Equations (PDEs). Specifically, we compare the efficacy of traditional Multilayer Perceptrons (MLPs) with fixed and learnable activations against Kolmogorov-Arnold Networks (KANs), which employ learnable basis functions. PINNs have emerged as an effective method for directly incorporating physical laws into the learning process, offering a data-efficient solution for both the forward and inverse problems associated with PDEs. However, challenges such as difficult training and spectral bias, where low-frequency components are learned more effectively than high-frequency ones, often limit their applicability to problems characterized by rapid oscillations or sharp transitions. By employing different activation and basis functions in MLPs and KANs, we assess their impact on convergence behavior, spectral bias mitigation, and the accuracy of the resulting PDE approximations. The findings offer insights into the design of neural network architectures that balance training efficiency, convergence speed, and test accuracy for PDE solvers. By evaluating the influence of activation and basis function choices, this work provides guidelines for developing more robust and accurate PINN models. The source code and pre-trained models used in this study are made publicly available to facilitate reproducibility and future exploration.
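To make the comparison concrete, here is a minimal sketch of a PINN with a learnable activation for the 1-D Poisson problem u''(x) = -sin(x). The PDE, the network shape, and the specific learnable form tanh(a·x) with a trainable per-layer slope a are illustrative assumptions, not the architecture evaluated in the paper:

```python
import torch
import torch.nn as nn

class LearnableTanh(nn.Module):
    """tanh(a * x) with a trainable slope a (one per layer)."""
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.ones(1))  # learned jointly with the weights

    def forward(self, x):
        return torch.tanh(self.a * x)

class PINN(nn.Module):
    def __init__(self, width=32, depth=3):
        super().__init__()
        layers, in_dim = [], 1
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), LearnableTanh()]
            in_dim = width
        layers.append(nn.Linear(in_dim, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

def pde_residual(model, x):
    """Residual of u''(x) + sin(x) = 0, computed with autograd."""
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + torch.sin(x)

model = PINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_col = torch.linspace(0, torch.pi, 128).reshape(-1, 1)  # collocation points
x_bc = torch.tensor([[0.0], [torch.pi]])                 # boundary: u = 0

for step in range(2000):
    opt.zero_grad()
    loss = pde_residual(model, x_col).pow(2).mean() + model(x_bc).pow(2).mean()
    loss.backward()
    opt.step()
```

Replacing LearnableTanh with a fixed nn.Tanh() yields the fixed-activation baseline, while a KAN would instead replace each linear-plus-activation pair with learnable univariate basis expansions (e.g., B-splines) on the network's edges.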
Related papers
- SPIKANs: Separable Physics-Informed Kolmogorov-Arnold Networks
Physics-Informed Neural Networks (PINNs) have emerged as a promising method for solving partial differential equations (PDEs).
We introduce Separable Physics-Informed Kolmogorov-Arnold Networks (SPIKANs).
This novel architecture applies the principle of separation of variables to PIKANs, decomposing the problem such that each dimension is handled by an individual KAN.
arXiv Detail & Related papers (2024-11-09T21:10:23Z)
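The separation-of-variables idea above can be sketched as a sum of products of per-dimension functions, u(x, y) ≈ Σ_r f_r(x) g_r(y), with one 1-D network per dimension. The sketch below uses small MLP factors for brevity, whereas SPIKANs use KANs; the rank and widths are arbitrary choices:

```python
import torch
import torch.nn as nn

class SeparableNet(nn.Module):
    """u(x, y) ~ sum_r f_r(x) * g_r(y): one 1-D network per dimension."""
    def __init__(self, rank=16, width=32):
        super().__init__()
        def branch():  # 1-D network producing `rank` features
            return nn.Sequential(
                nn.Linear(1, width), nn.Tanh(),
                nn.Linear(width, rank),
            )
        self.fx, self.gy = branch(), branch()

    def forward(self, x, y):
        # Outer-product combination on a tensor-product grid:
        # (Nx, R) with (Ny, R) -> (Nx, Ny)
        return torch.einsum("ir,jr->ij", self.fx(x), self.gy(y))

model = SeparableNet()
x = torch.linspace(0, 1, 64).reshape(-1, 1)
y = torch.linspace(0, 1, 64).reshape(-1, 1)
u = model(x, y)  # 64 x 64 solution values from two 1-D forward passes
```

Because each factor network only ever sees 1-D inputs, collocation on an N×N grid needs two batches of N forward passes rather than N² joint evaluations, which is the source of the efficiency gain.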
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
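The summary does not specify the ProdLayer's internals, so the following is only a plausible reading: a layer that forms learned pairwise products of channels, letting the model express dimensionally meaningful quantities (e.g., a flux as velocity times density) in a single layer rather than approximating products with deep stacks:

```python
import torch
import torch.nn as nn

class ProdLayer(nn.Module):
    """Channel mixing augmented with explicit learned product terms.
    Hypothetical sketch; the actual ProdLayer design may differ."""
    def __init__(self, channels, n_products=4):
        super().__init__()
        self.pa = nn.Linear(channels, n_products)
        self.pb = nn.Linear(channels, n_products)
        self.mix = nn.Linear(channels + n_products, channels)

    def forward(self, h):                  # h: (batch, points, channels)
        prod = self.pa(h) * self.pb(h)     # learned pairwise products
        return self.mix(torch.cat([h, prod], dim=-1))

layer = ProdLayer(channels=32)
h = torch.randn(8, 100, 32)   # batch of 8 functions on 100 grid points
out = layer(h)                # same shape, product-augmented mixing
```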
- DeltaPhi: Learning Physical Trajectory Residual for PDE Solving
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn a surrogate model for the residual operator mapping, built on existing neural operator networks.
We conclude that, compared to direct learning, physical residual learning is preferred for PDE solving.
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
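Residual learning here can be sketched as wrapping any base neural operator so it predicts a correction to a known reference trajectory rather than the full solution. The interface below (new condition a, reference pair a_ref/u_ref, and a toy MLP standing in for the operator) is a hypothetical simplification:

```python
import torch
import torch.nn as nn

class ResidualOperator(nn.Module):
    """Wraps a base neural operator so it predicts the residual
    u - u_ref relative to a reference trajectory, instead of u directly."""
    def __init__(self, base_operator):
        super().__init__()
        self.base = base_operator

    def forward(self, a, a_ref, u_ref):
        # New condition a, reference condition a_ref with known solution
        # u_ref; the network only has to learn the correction term.
        delta = self.base(torch.cat([a, a_ref, u_ref], dim=-1))
        return u_ref + delta

# Toy MLP standing in for a real neural operator (e.g., an FNO).
base = nn.Sequential(nn.Linear(3, 64), nn.GELU(), nn.Linear(64, 1))
model = ResidualOperator(base)
a, a_ref, u_ref = (torch.randn(128, 1) for _ in range(3))
u_pred = model(a, a_ref, u_ref)
```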
- VS-PINN: A fast and efficient training of physics-informed neural networks using variable-scaling methods for solving PDEs with stiff behavior
We propose a new method for training PINNs using variable-scaling techniques.
We demonstrate the effectiveness of the proposed method on stiff problems and confirm that it can significantly improve the training efficiency and performance of PINNs.
arXiv Detail & Related papers (2024-06-10T14:11:15Z)
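A minimal sketch of the variable-scaling idea, assuming the simple form x̃ = s·x: the network is fed a stretched coordinate so that a thin boundary layer becomes an O(1) feature, while autograd still differentiates with respect to the original variable. The stiff model problem ε u'' + u' = 0 and the scale factor are illustrative choices, not the paper's setup:

```python
import torch
import torch.nn as nn

# Variable scaling: the network sees x_tilde = s * x, so a boundary
# layer of width ~1/s becomes an O(1) feature for the network.
s = 50.0  # scaling factor, problem-dependent

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))

def u(x):
    return net(s * x)  # network operates on the well-scaled variable

def residual(x, eps=1e-2):
    # Stiff model problem: eps * u'' + u' = 0 (sharp boundary layer).
    x = x.requires_grad_(True)
    ux = u(x)
    du = torch.autograd.grad(ux, x, torch.ones_like(ux), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return eps * d2u + du

r = residual(torch.rand(256, 1))  # collocation residuals for the PINN loss
```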
- Transport Equation based Physics Informed Neural Network to predict the Yield Strength of Architected Materials
The PINN model showcases exceptional generalization capabilities, indicating its capacity to avoid overfitting on the provided dataset.
The research underscores the importance of striking a balance between performance and computational efficiency while selecting an activation function for specific real-world applications.
arXiv Detail & Related papers (2023-07-29T12:42:03Z)
- Auxiliary-Tasks Learning for Physics-Informed Neural Network-Based Partial Differential Equations Solving
Physics-informed neural networks (PINNs) have emerged as promising surrogate models for solving partial differential equations (PDEs).
We propose auxiliary-task learning-based ATL-PINNs, which provide four different auxiliary-task learning modes.
Our findings show that the proposed auxiliary-task learning modes can significantly improve solution accuracy, achieving a maximum performance boost of 96.62%.
arXiv Detail & Related papers (2023-07-12T13:46:40Z)
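The four auxiliary-task modes are not detailed in the summary; the sketch below shows only the common hard-parameter-sharing pattern such methods build on: a shared trunk with one head per task and a weighted sum of task losses. The placeholder losses and the weight 0.5 are illustrative:

```python
import torch
import torch.nn as nn

class MultiTaskPINN(nn.Module):
    """Hard parameter sharing: one trunk, one head for the main PDE and
    one head for an auxiliary task (e.g., a related PDE instance)."""
    def __init__(self, width=64):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(2, width), nn.Tanh(),
                                   nn.Linear(width, width), nn.Tanh())
        self.main_head = nn.Linear(width, 1)
        self.aux_head = nn.Linear(width, 1)

    def forward(self, xt):
        h = self.trunk(xt)
        return self.main_head(h), self.aux_head(h)

def pde_loss(u, xt):
    # Placeholder standing in for a real PDE residual loss.
    return u.pow(2).mean()

model = MultiTaskPINN()
xt = torch.rand(256, 2)                 # (x, t) collocation points
u_main, u_aux = model(xt)
loss = pde_loss(u_main, xt) + 0.5 * pde_loss(u_aux, xt)  # weighted tasks
loss.backward()                         # gradients flow into the shared trunk
```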
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
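An implicit (backward-Euler) SGD step evaluates the gradient at the *next* iterate, θ_{k+1} = θ_k − η ∇L(θ_{k+1}), which damps the instability of large explicit steps. A minimal sketch, solving the implicit equation by a few fixed-point iterations (the paper's actual solver may differ):

```python
import torch

def implicit_sgd_step(params, loss_fn, lr=0.1, inner_iters=5):
    """One implicit step: theta+ = theta - lr * grad L(theta+),
    solved approximately by fixed-point iteration. Explicit SGD
    would evaluate the gradient at theta instead."""
    theta0 = [p.detach().clone() for p in params]
    for _ in range(inner_iters):
        grads = torch.autograd.grad(loss_fn(), params)
        with torch.no_grad():
            for p, p0, g in zip(params, theta0, grads):
                p.copy_(p0 - lr * g)  # anchor at theta, gradient at the iterate

w = torch.randn(3, requires_grad=True)
implicit_sgd_step([w], lambda: (w ** 4).sum())  # toy loss in place of a PINN loss
```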
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z)
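Spectral bias is visible in the eigenvalue spectrum of the empirical NTK, K = JJᵀ with J the Jacobian of network outputs with respect to the parameters: components aligned with large eigenvalues are learned fastest. A small sketch of that computation (brute-force Jacobian, illustrative network):

```python
import torch
import torch.nn as nn

# Empirical NTK at sampled inputs: K = J J^T, with J the Jacobian of the
# network outputs w.r.t. all parameters. Fast NTK eigenvalue decay means
# low-frequency components dominate training (spectral bias).
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
x = torch.linspace(0, 1, 32).reshape(-1, 1)

out = net(x)
rows = []
for i in range(out.shape[0]):  # brute-force Jacobian, fine at this size
    grads = torch.autograd.grad(out[i, 0], net.parameters(), retain_graph=True)
    rows.append(torch.cat([g.reshape(-1) for g in grads]))
J = torch.stack(rows)                # (n_points, n_params)
K = J @ J.T                          # empirical NTK
eigvals = torch.linalg.eigvalsh(K)   # ascending order
print(eigvals[-5:] / eigvals[-1])    # normalized top of the spectrum
```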
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
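As a stand-in for the paper's NAS procedure, the sketch below runs plain random search over a toy PINN design space (width, depth, activation); the real Auto-PINN search is more structured, and evaluate() is a placeholder where brief training and validation of the PDE residual would go:

```python
import random
import torch
import torch.nn as nn

# Toy design space; random search shown only to make the loop concrete.
SPACE = {"width": [16, 32, 64], "depth": [2, 4, 6],
         "act": [nn.Tanh, nn.SiLU]}

def build(width, depth, act):
    layers, d = [], 1
    for _ in range(depth):
        layers += [nn.Linear(d, width), act()]
        d = width
    return nn.Sequential(*layers, nn.Linear(d, 1))

def evaluate(model):
    # Placeholder for: train briefly, return the validation PDE residual.
    x = torch.rand(64, 1)
    return model(x).pow(2).mean().item()

best = min((dict(width=random.choice(SPACE["width"]),
                 depth=random.choice(SPACE["depth"]),
                 act=random.choice(SPACE["act"])) for _ in range(20)),
           key=lambda c: evaluate(build(**c)))
print(best)
```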
- Characterizing possible failure modes in physics-informed neural networks
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- A Physics Informed Neural Network Approach to Solution and Identification of Biharmonic Equations of Elasticity
We explore an application of Physics-Informed Neural Networks (PINNs) in conjunction with Airy stress functions and Fourier series.
We find that enriching feature space using Airy stress functions can significantly improve the accuracy of PINN solutions for biharmonic PDEs.
arXiv Detail & Related papers (2021-08-16T17:19:50Z)
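Feature enrichment of this kind can be sketched as a Fourier-feature embedding prepended to the network that models the Airy stress function φ (the biharmonic unknown). The frequencies and layer sizes below are arbitrary, and the paper's Airy-function-specific features are not reproduced here:

```python
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    """Maps coordinates x to [sin(2^k * pi * x), cos(2^k * pi * x)]_k,
    enriching the input so the network can capture oscillatory terms
    of a Fourier-series ansatz."""
    def __init__(self, in_dim=2, n_freq=4):
        super().__init__()
        self.freqs = 2.0 ** torch.arange(n_freq) * torch.pi
        self.out_dim = in_dim * 2 * n_freq

    def forward(self, x):                  # x: (batch, in_dim)
        xf = x.unsqueeze(-1) * self.freqs  # (batch, in_dim, n_freq)
        feats = torch.cat([torch.sin(xf), torch.cos(xf)], dim=-1)
        return feats.flatten(1)            # (batch, out_dim)

embed = FourierFeatures(in_dim=2, n_freq=4)
net = nn.Sequential(embed, nn.Linear(embed.out_dim, 64), nn.Tanh(),
                    nn.Linear(64, 1))
phi = net(torch.rand(128, 2))  # Airy stress function values phi(x, y)
```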