Learnable Activation Functions in Physics-Informed Neural Networks for Solving Partial Differential Equations
- URL: http://arxiv.org/abs/2411.15111v1
- Date: Fri, 22 Nov 2024 18:25:13 GMT
- Title: Learnable Activation Functions in Physics-Informed Neural Networks for Solving Partial Differential Equations
- Authors: Afrah Farea, Mustafa Serdar Celebi
- Abstract summary: We investigate the use of learnable activation functions in Physics-Informed Neural Networks (PINNs) for solving Partial Differential Equations (PDEs).
We compare the efficacy of traditional Multilayer Perceptrons (MLPs) with fixed and learnable activations against Kolmogorov-Arnold Networks (KANs).
The findings offer insights into the design of neural network architectures that balance training efficiency, convergence speed, and test accuracy for PDE solvers.
- Abstract: We investigate the use of learnable activation functions in Physics-Informed Neural Networks (PINNs) for solving Partial Differential Equations (PDEs). Specifically, we compare the efficacy of traditional Multilayer Perceptrons (MLPs) with fixed and learnable activations against Kolmogorov-Arnold Networks (KANs), which employ learnable basis functions. PINNs have emerged as an effective method for directly incorporating physical laws into the learning process, offering a data-efficient solution for both the forward and inverse problems associated with PDEs. However, challenges such as effective training and spectral bias, where low-frequency components are learned more effectively than high-frequency ones, often limit their applicability to problems characterized by rapid oscillations or sharp transitions. By employing different activation or basis functions in MLPs and KANs, we assess their impact on convergence behavior, spectral bias mitigation, and the accuracy of the resulting PDE approximations. The findings offer insights into the design of neural network architectures that balance training efficiency, convergence speed, and test accuracy for PDE solvers. By evaluating the influence of activation or basis function choices, this work provides guidelines for developing more robust and accurate PINN models. The source code and pre-trained models used in this study are made publicly available to facilitate reproducibility and future exploration.
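To ground the comparison, here is a minimal sketch (PyTorch) of a PINN with a learnable activation: an MLP whose tanh units carry a trainable slope parameter, trained on the 1D Poisson problem u''(x) = f(x). The architecture, test problem, and hyperparameters are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch: a PINN for u''(x) = f(x) with a learnable tanh activation,
# tanh(a * x), where the slope a is trained alongside the weights.
import torch
import torch.nn as nn

class AdaptiveTanh(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.ones(1))   # trainable slope, initialized to 1

    def forward(self, x):
        return torch.tanh(self.a * x)

class PINN(nn.Module):
    def __init__(self, width=32, depth=4):
        super().__init__()
        layers, in_dim = [], 1
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), AdaptiveTanh()]
            in_dim = width
        layers += [nn.Linear(in_dim, 1)]
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

def pde_residual(model, x, f):
    """Residual u''(x) - f(x), with derivatives taken by autograd."""
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u - f(x)

model = PINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
f = lambda x: -torch.sin(x)                    # manufactured source: u(x) = sin(x)
x_int = torch.linspace(0.0, torch.pi, 128).reshape(-1, 1)
x_bc = torch.tensor([[0.0], [torch.pi]])       # boundary points with u = 0

for step in range(2000):
    opt.zero_grad()
    loss = pde_residual(model, x_int, f).pow(2).mean() + model(x_bc).pow(2).mean()
    loss.backward()
    opt.step()
```

Swapping AdaptiveTanh for nn.Tanh() recovers the fixed-activation baseline, so the same loop serves both sides of the comparison; a KAN would instead make the basis functions themselves learnable.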
Related papers
- Towards a Foundation Model for Physics-Informed Neural Networks: Multi-PDE Learning with Active Sampling
Physics-Informed Neural Networks (PINNs) have emerged as a powerful framework for solving partial differential equations (PDEs) by embedding physical laws into neural network training.
In this work, we explore the potential of a foundation PINN model capable of solving multiple PDEs within a unified architecture.
arXiv Detail & Related papers (2025-02-11T10:12:28Z)
- Low Tensor-Rank Adaptation of Kolmogorov--Arnold Networks
Kolmogorov--Arnold networks (KANs) have demonstrated their potential as an alternative to multi-layer perceptrons (MLPs) in various domains.
We develop low tensor-rank adaptation (LoTRA) for fine-tuning KANs.
We explore the application of LoTRA for efficiently solving various partial differential equations (PDEs) by fine-tuning KANs.
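As a rough illustration of the idea, the sketch below freezes a KAN-like layer's third-order coefficient tensor and learns only a Tucker-factored low-rank update. The basis functions, ranks, and initialization here are assumptions for the example, not LoTRA's exact formulation.

```python
# Hedged sketch of low tensor-rank (Tucker-style) adaptation of a KAN-like
# layer whose learnable coefficients form a tensor C[out, in, n_basis]:
# freeze C and train only a low-rank additive update Delta C.
import torch
import torch.nn as nn

class LoTRAKANLayer(nn.Module):
    def __init__(self, d_in, d_out, n_basis=8, ranks=(4, 4, 2)):
        super().__init__()
        r1, r2, r3 = ranks
        # Frozen pre-trained coefficient tensor (random here for illustration).
        self.C = nn.Parameter(torch.randn(d_out, d_in, n_basis) * 0.1,
                              requires_grad=False)
        # Trainable Tucker factors of the update Delta C.
        self.core = nn.Parameter(torch.zeros(r1, r2, r3))   # zero init: no-op at start
        self.U = nn.Parameter(torch.randn(d_out, r1) * 0.1)
        self.V = nn.Parameter(torch.randn(d_in, r2) * 0.1)
        self.W = nn.Parameter(torch.randn(n_basis, r3) * 0.1)
        # Fixed sine basis per input coordinate (a stand-in for B-splines).
        self.register_buffer("freqs", torch.arange(1, n_basis + 1).float())

    def forward(self, x):                                   # x: (batch, d_in)
        phi = torch.sin(x.unsqueeze(-1) * self.freqs)       # (batch, d_in, n_basis)
        delta = torch.einsum("abc,oa,ib,gc->oig",
                             self.core, self.U, self.V, self.W)
        return torch.einsum("oig,big->bo", self.C + delta, phi)
```

Only the core and the three factor matrices receive gradients, so the number of trainable parameters scales with the chosen ranks rather than with the full coefficient tensor.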
arXiv Detail & Related papers (2025-02-10T04:57:07Z)
- AL-PINN: Active Learning-Driven Physics-Informed Neural Networks for Efficient Sample Selection in Solving Partial Differential Equations
Physics-Informed Neural Networks (PINNs) have emerged as a promising approach for solving Partial Differential Equations (PDEs).
We propose Active Learning-Driven PINNs (AL-PINN), which integrates Uncertainty Quantification (UQ) and Active Learning strategies to optimize sample selection dynamically.
Our results demonstrate that AL-PINN achieves accuracy comparable or superior to that of traditional PINNs while reducing the number of required training samples.
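A generic version of residual-driven sample selection is sketched below; AL-PINN's acquisition combines uncertainty quantification with active learning, so the plain residual-magnitude criterion here is only a stand-in for the paper's actual strategy.

```python
# Generic sketch of active collocation-point selection for a PINN:
# score a candidate pool by |PDE residual| and keep the worst offenders.
import torch

def select_collocation_points(model, residual_fn, pool, k):
    """Pick the k pool points with the largest PDE-residual magnitude."""
    with torch.enable_grad():            # residuals need autograd derivatives
        r = residual_fn(model, pool).abs().squeeze(-1).detach()
    idx = torch.topk(r, k).indices
    return pool[idx].detach()

# Usage inside a training loop, reusing a standard PINN residual:
# pool   = torch.rand(4096, 1) * torch.pi            # candidates in the domain
# res_fn = lambda m, x: pde_residual(m, x, f)        # residual as sketched above
# x_new  = select_collocation_points(model, res_fn, pool, k=256)
# x_int  = torch.cat([x_int, x_new], dim=0)          # grow the training set
```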
arXiv Detail & Related papers (2025-02-06T10:54:28Z)
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
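One plausible reading of a product layer is sketched below: alongside the usual linear map, the layer forms learned pairwise products of features, echoing how dimensional analysis builds derived quantities as products of base quantities. The exact ProdLayer definition is given in the paper; this is an assumption-labeled illustration only.

```python
# Hedged sketch of a product-augmented layer in the spirit of DimOL's
# ProdLayer: linear terms plus learned elementwise products of projections.
import torch
import torch.nn as nn

class ProdLayerSketch(nn.Module):
    def __init__(self, d_in, d_out, n_prod=4):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        # Two projections whose elementwise product yields quadratic features.
        self.p = nn.Linear(d_in, n_prod)
        self.q = nn.Linear(d_in, n_prod)
        self.mix = nn.Linear(n_prod, d_out, bias=False)

    def forward(self, x):
        return self.linear(x) + self.mix(self.p(x) * self.q(x))
```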
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- DeltaPhi: Learning Physical Trajectory Residual for PDE Solving
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn the surrogate model for the residual operator mapping based on existing neural operator networks.
We conclude that, compared to direct learning, physical residual learning is preferred for PDE solving.
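The gist of residual trajectory learning can be sketched as follows: given a similar, already-solved reference trajectory, the operator network predicts only the correction to it. The interface below (names, arguments, and the exact form of the residual mapping) is hypothetical, not DeltaPhi's API.

```python
# Sketch of physical trajectory residual learning: rather than mapping an
# input function a directly to its solution u, predict the residual relative
# to a known reference pair (a_ref, u_ref), so the network learns a correction.
import torch
import torch.nn as nn

class ResidualOperator(nn.Module):
    def __init__(self, backbone: nn.Module):
        super().__init__()
        self.backbone = backbone   # any neural operator, e.g. an FNO

    def forward(self, a, a_ref, u_ref):
        # Predict only the correction on top of the reference solution.
        return u_ref + self.backbone(a - a_ref)
```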
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- VS-PINN: A fast and efficient training of physics-informed neural networks using variable-scaling methods for solving PDEs with stiff behavior
We propose a new method for training PINNs using variable-scaling techniques.
We demonstrate the effectiveness of the proposed method on PDEs with stiff behavior and confirm that it can significantly improve the training efficiency and performance of PINNs.
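The core trick can be illustrated in a few lines: feed the network a stretched coordinate so that stiff, fast-varying features become low-frequency from the network's point of view, while autograd's chain rule keeps derivatives consistent with the original variable. The scale factor below is a hand-picked illustration, not the paper's prescription.

```python
# Sketch of variable scaling: train on a stretched coordinate x_tilde = N * x.
import torch
import torch.nn as nn

N = 50.0                                    # scaling factor for a stiff problem
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))

def u(x):
    return net(N * x)                       # network sees the stretched variable

x = torch.rand(64, 1, requires_grad=True)
# d u / d x still refers to the original x; autograd handles the factor N.
ux = torch.autograd.grad(u(x), x, torch.ones(64, 1), create_graph=True)[0]
```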
arXiv Detail & Related papers (2024-06-10T14:11:15Z)
- Transport Equation based Physics Informed Neural Network to predict the Yield Strength of Architected Materials
The PINN model showcases exceptional generalization capabilities, indicating its capacity to avoid overfitting with the provided dataset.
The research underscores the importance of striking a balance between performance and computational efficiency while selecting an activation function for specific real-world applications.
arXiv Detail & Related papers (2023-07-29T12:42:03Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
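An implicit SGD step evaluates the gradient at the new iterate, theta_{k+1} = theta_k - eta * grad L(theta_{k+1}), which is equivalent to a proximal minimization. The sketch below solves that inner problem approximately with a few explicit gradient steps; the inner-loop choices are illustrative assumptions, not the paper's scheme.

```python
# Sketch of an implicit (proximal) SGD step:
#   theta_{k+1} = argmin_theta  L(theta) + ||theta - theta_k||^2 / (2 * eta),
# solved approximately by a short inner gradient loop.
import torch

def implicit_sgd_step(params, loss_fn, eta=1e-2, inner_steps=5, inner_lr=1e-3):
    anchor = [p.detach().clone() for p in params]   # theta_k, held fixed
    for _ in range(inner_steps):
        loss = loss_fn()                            # rebuilt from current params
        prox = sum(((p - a) ** 2).sum() for p, a in zip(params, anchor))
        grads = torch.autograd.grad(loss + prox / (2 * eta), params)
        with torch.no_grad():
            for p, g in zip(params, grads):
                p -= inner_lr * g

# Usage: implicit_sgd_step(list(model.parameters()), lambda: pinn_loss(model))
```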
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
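The empirical NTK at the heart of such analyses can be computed directly with autograd: form the Jacobian J of network outputs with respect to parameters and take K = J J^T, whose eigenvalue decay visualizes spectral bias. The sketch below is a generic diagnostic along those lines, not the paper's momentum analysis.

```python
# Sketch: empirical NTK K = J J^T for a small network, using torch.func.
import torch
import torch.nn as nn
from torch.func import functional_call, jacrev

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
params = dict(net.named_parameters())
x = torch.linspace(-1, 1, 64).reshape(-1, 1)

def model_fn(p, x):
    return functional_call(net, p, (x,)).squeeze(-1)   # outputs: (64,)

jac = jacrev(model_fn)(params, x)          # per-parameter Jacobians (a dict)
J = torch.cat([j.reshape(64, -1) for j in jac.values()], dim=1)
K = J @ J.T                                # empirical NTK, shape (64, 64)
eigvals = torch.linalg.eigvalsh(K)         # fast-decaying spectrum => spectral bias
```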
arXiv Detail & Related papers (2022-06-29T19:03:10Z)
- Characterizing possible failure modes in physics-informed neural networks
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- A Physics Informed Neural Network Approach to Solution and Identification of Biharmonic Equations of Elasticity
We explore an application of Physics-Informed Neural Networks (PINNs) in conjunction with Airy stress functions and Fourier series.
We find that enriching feature space using Airy stress functions can significantly improve the accuracy of PINN solutions for biharmonic PDEs.
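Feature-space enrichment is easy to prototype: map raw coordinates through a fixed embedding before the MLP. The paper's enrichment uses Airy stress functions and Fourier series specific to elasticity; the generic random Fourier embedding below only illustrates the mechanism.

```python
# Sketch of feature-space enrichment: a fixed random Fourier embedding
# prepended to an MLP, a common aid for capturing oscillatory solutions.
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    def __init__(self, d_in, n_feat=32, scale=5.0):
        super().__init__()
        # Fixed random projection; registered as a buffer, never trained.
        self.register_buffer("B", torch.randn(d_in, n_feat) * scale)

    def forward(self, x):                   # x: (batch, d_in)
        z = x @ self.B                      # (batch, n_feat)
        return torch.cat([torch.sin(z), torch.cos(z)], dim=-1)

embed = FourierFeatures(d_in=2)             # output width: 2 * n_feat = 64
net = nn.Sequential(embed, nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
```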
arXiv Detail & Related papers (2021-08-16T17:19:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.