Leveraging Influence Functions for Resampling Data in Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2506.16443v1
- Date: Thu, 19 Jun 2025 16:21:14 GMT
- Title: Leveraging Influence Functions for Resampling Data in Physics-Informed Neural Networks
- Authors: Jonas R. Naujoks, Aleksander Krasowski, Moritz Weckbecker, Galip Ümit Yolcu, Thomas Wiegand, Sebastian Lapuschkin, Wojciech Samek, René P. Klausen
- Abstract summary: Physics-informed neural networks (PINNs) offer a powerful approach to solving partial differential equations (PDEs). PINNs have recently emerged as a valuable tool in the field of scientific machine learning.
- Score: 45.752780408098765
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Physics-informed neural networks (PINNs) offer a powerful approach to solving partial differential equations (PDEs), which are ubiquitous in the quantitative sciences. Applied to both forward and inverse problems across various scientific domains, PINNs have recently emerged as a valuable tool in the field of scientific machine learning. A key aspect of their training is that the data -- spatio-temporal points sampled from the PDE's input domain -- are readily available. Influence functions, a tool from the field of explainable AI (XAI), approximate the effect of individual training points on the model, enhancing interpretability. In the present work, we explore the application of influence function-based sampling approaches for the training data. Our results indicate that such targeted resampling based on data attribution methods has the potential to enhance prediction accuracy in physics-informed neural networks, demonstrating a practical application of an XAI method in PINN training.
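To make the resampling idea concrete, here is a minimal sketch, not the paper's algorithm: given per-point influence scores (however they are approximated), keep the most influential collocation points and refill the pool with fresh uniform draws from the input domain. The function name, the top-k retention rule, and the keep/fresh split are assumptions made for illustration.

```python
import numpy as np

def resample_collocation_points(points, influence_scores, n_keep, n_fresh, domain, rng):
    """Resample PINN collocation points: keep the points with the largest
    (absolute) influence scores and refill the rest with fresh uniform
    draws from the input domain, preserving overall coverage.

    points           -- (N, d) array of current collocation points
    influence_scores -- (N,) array; larger magnitude = more influential
    n_keep           -- number of high-influence points to retain
    n_fresh          -- number of new uniform points to draw
    domain           -- (d, 2) array of [low, high] bounds per dimension
    """
    # Rank points by absolute influence and keep the top n_keep.
    keep_idx = np.argsort(-np.abs(influence_scores))[:n_keep]
    kept = points[keep_idx]

    # Draw fresh points uniformly from the domain.
    low, high = domain[:, 0], domain[:, 1]
    fresh = rng.uniform(low, high, size=(n_fresh, points.shape[1]))

    return np.vstack([kept, fresh])

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(100, 2))   # 2-D spatio-temporal points
scores = rng.normal(size=100)                # stand-in influence scores
domain = np.array([[0.0, 1.0], [0.0, 1.0]])
new_pts = resample_collocation_points(pts, scores, n_keep=60, n_fresh=40,
                                      domain=domain, rng=rng)
print(new_pts.shape)  # (100, 2)
```

In practice the scores would come from an influence-function approximation evaluated on the current model; the sketch only shows the resampling step downstream of that.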
Related papers
- Improving physics-informed neural network extrapolation via transfer learning and adaptive activation functions [44.44497277876625]
Physics-Informed Neural Networks (PINNs) are deep learning models that incorporate the governing physical laws of a system into the learning process. We introduce a transfer learning (TL) method to improve the extrapolation capability of PINNs. We demonstrate that our method achieves an average 40% reduction in relative L2 error and an average 50% reduction in mean absolute error.
arXiv Detail & Related papers (2025-07-16T22:19:53Z) - Adaptive Physics-informed Neural Networks: A Survey [15.350973327319418]
Physics-informed neural networks (PINNs) have emerged as a promising approach to solving partial differential equations. This survey reviews existing research that addresses their limitations through transfer learning and meta-learning.
arXiv Detail & Related papers (2025-03-23T19:33:05Z) - AL-PINN: Active Learning-Driven Physics-Informed Neural Networks for Efficient Sample Selection in Solving Partial Differential Equations [0.0]
Physics-Informed Neural Networks (PINNs) have emerged as a promising approach for solving Partial Differential Equations (PDEs). We propose Active Learning-Driven PINNs (AL-PINN), which integrate Uncertainty Quantification (UQ) and active learning strategies to optimize sample selection dynamically. Our results demonstrate that AL-PINN achieves comparable or superior accuracy to traditional PINNs while reducing the number of required training samples.
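The sample-selection loop AL-PINN describes can be approximated by a simple acquisition rule. The sketch below uses PDE residual magnitude as a stand-in for the paper's uncertainty quantification; the helper name and the top-k rule are assumptions for illustration.

```python
import numpy as np

def select_active_points(candidates, residuals, k):
    """Pick the k candidate points with the largest absolute PDE residual,
    a simple stand-in for an uncertainty-driven acquisition rule."""
    top = np.argsort(-np.abs(residuals))[:k]
    return candidates[top]

# Toy 1-D candidate pool with made-up residual magnitudes.
candidates = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
residuals = np.array([0.0, 0.1, 0.9, 0.2, 0.05, 0.8, 0.1, 0.0, 0.3, 0.7, 0.2])
picked = select_active_points(candidates, residuals, k=3)
print(picked.ravel())  # the three points with the largest residuals
```

In an actual AL-PINN loop this selection would alternate with training steps, with residuals recomputed on the current model each round.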
arXiv Detail & Related papers (2025-02-06T10:54:28Z) - Learnable Activation Functions in Physics-Informed Neural Networks for Solving Partial Differential Equations [0.0]
Physics-Informed Neural Networks (PINNs) have emerged as a promising approach for solving Partial Differential Equations (PDEs). Existing limitations reduce their accuracy for problems involving rapid oscillations, sharp gradients, and complex boundary behaviors. We investigate learnable activation functions as a solution to these challenges.
arXiv Detail & Related papers (2024-11-22T18:25:13Z) - PINNfluence: Influence Functions for Physics-Informed Neural Networks [47.27512105490682]
Physics-informed neural networks (PINNs) have emerged as a flexible and promising application of deep learning to partial differential equations in the physical sciences. We explore the application of influence functions (IFs) to validate and debug PINNs post-hoc.
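The influence-function idea behind this line of work can be illustrated on a model where everything is computable in closed form. The sketch below uses ridge regression as an assumption for illustration (PINNfluence applies IFs to PINNs) and compares the standard first-order influence estimate, grad_test^T H^{-1} grad_i, against exact leave-one-out retraining.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)
lam = 1e-2

def ridge_fit(X, y, lam):
    """Minimize 0.5*||Xw - y||^2 + 0.5*lam*||w||^2 in closed form."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w = ridge_fit(X, y, lam)
H = X.T @ X + lam * np.eye(d)        # Hessian of the summed squared-error loss
x_t, y_t = rng.normal(size=d), 0.3   # a held-out test point
g_test = x_t * (x_t @ w - y_t)       # gradient of the test loss at w

# First-order influence: predicted change in test loss if point i is removed.
influences = np.array([
    (x_i * (x_i @ w - y_i)) @ np.linalg.solve(H, g_test)
    for x_i, y_i in zip(X, y)
])

# Exact leave-one-out change in test loss, for comparison.
def loo_delta(i):
    mask = np.arange(n) != i
    w_i = ridge_fit(X[mask], y[mask], lam)
    return 0.5 * ((x_t @ w_i - y_t) ** 2 - (x_t @ w - y_t) ** 2)

exact = np.array([loo_delta(i) for i in range(n)])
corr = np.corrcoef(influences, exact)[0, 1]
print(f"correlation with exact leave-one-out: {corr:.3f}")
```

For PINNs the Hessian is never formed explicitly; implicit Hessian-vector products or low-rank approximations stand in for `np.linalg.solve(H, g_test)`.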
arXiv Detail & Related papers (2024-09-13T16:23:17Z) - DeltaPhi: Learning Physical Trajectory Residual for PDE Solving [54.13671100638092]
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn a surrogate model for the residual operator mapping based on existing neural operator networks.
We conclude that, compared to direct learning, physical residual learning is preferable for PDE solving.
arXiv Detail & Related papers (2024-06-14T07:45:07Z) - Efficient and Flexible Neural Network Training through Layer-wise Feedback Propagation [49.44309457870649]
Layer-wise Feedback Propagation (LFP) is a novel training principle for neural-network-like predictors. LFP decomposes a reward among individual neurons based on their respective contributions. The method then implements a greedy approach, reinforcing helpful parts of the network and weakening harmful ones.
arXiv Detail & Related papers (2023-08-23T10:48:28Z) - Transport Equation based Physics Informed Neural Network to predict the Yield Strength of Architected Materials [0.0]
The PINN model showcases exceptional generalization capabilities, indicating its capacity to avoid overfitting with the provided dataset.
The research underscores the importance of striking a balance between performance and computational efficiency while selecting an activation function for specific real-world applications.
arXiv Detail & Related papers (2023-07-29T12:42:03Z) - Understanding and Mitigating Extrapolation Failures in Physics-Informed Neural Networks [1.1510009152620668]
We study the extrapolation behavior of PINNs on a representative set of PDEs of different types.
We find that failure to extrapolate is not caused by high frequencies in the solution function, but rather by shifts in the support of the Fourier spectrum over time.
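The spectral-shift diagnostic described above can be checked with a plain FFT. This is a toy sketch on synthetic signals, not the paper's PDE solutions: it shows the dominant frequency of a solution drifting upward between two time slices while the amplitude stays fixed.

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)

def dominant_frequency(u):
    """Index of the strongest non-DC Fourier mode of a periodic signal."""
    spec = np.abs(np.fft.rfft(u))
    return int(np.argmax(spec[1:]) + 1)  # skip the DC component

# Toy solution whose spectral support shifts over time,
# u(x, t) = sin((2 + 3t) * x): evaluated at t = 0 and t = 1.
u_early = np.sin(2 * x)
u_late = np.sin(5 * x)

print(dominant_frequency(u_early), dominant_frequency(u_late))  # 2 5
```

Tracking this index over the time axis of a trained PINN's prediction is one way to detect the support shift the paper identifies as the cause of extrapolation failure.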
arXiv Detail & Related papers (2023-06-15T20:08:42Z) - Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown effective in solving forward and inverse differential equation problems.
However, PINNs run into training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
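The stability benefit of the implicit update can be seen on a scalar quadratic loss, where both updates have closed forms. This is a textbook sketch of the mechanism, not the paper's ISGD algorithm: with a step size where explicit gradient descent diverges, the implicit update still contracts.

```python
# Loss L(theta) = 0.5 * lam * theta^2; eta * lam = 5 > 2, so explicit GD diverges.
lam, eta, theta0, steps = 10.0, 0.5, 1.0, 20

# Explicit update evaluates the gradient at the current iterate:
#   theta_{k+1} = theta_k - eta * lam * theta_k = (1 - eta*lam) * theta_k
theta_exp = theta0
for _ in range(steps):
    theta_exp = (1 - eta * lam) * theta_exp

# Implicit update evaluates the gradient at the *new* iterate:
#   theta_{k+1} = theta_k - eta * lam * theta_{k+1}
#   =>  theta_{k+1} = theta_k / (1 + eta*lam), stable for any eta > 0
theta_imp = theta0
for _ in range(steps):
    theta_imp = theta_imp / (1 + eta * lam)

print(abs(theta_exp), abs(theta_imp))  # explicit blows up, implicit contracts
```

Each implicit step requires solving an equation for the new iterate; for a general PINN loss this is done approximately, which is where the algorithmic work of the paper lies.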
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - On the Generalization of PINNs outside the training domain and the Hyperparameters influencing it [1.3927943269211593]
PINNs are neural network architectures trained to emulate solutions of differential equations without the need for solution data.
We perform an empirical analysis of the behavior of PINN predictions outside their training domain.
We assess whether the algorithmic setup of PINNs can influence their potential for generalization and show its effect on the predictions.
arXiv Detail & Related papers (2023-02-15T09:51:56Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.