Fixed-budget online adaptive mesh learning for physics-informed neural
networks. Towards parameterized problem inference
- URL: http://arxiv.org/abs/2212.11776v1
- Date: Thu, 22 Dec 2022 15:12:29 GMT
- Title: Fixed-budget online adaptive mesh learning for physics-informed neural
networks. Towards parameterized problem inference
- Authors: Thi Nguyen Khoa Nguyen, Thibault Dairay, Raphaël Meunier, Christophe Millet, Mathilde Mougeot
- Abstract summary: We propose a Fixed-Budget Online Adaptive Mesh Learning (FBOAML) method that selects collocation points during training based on the local maxima and local minima of the PDE residuals.
FBOAML is able to identify high-gradient regions and even gives better predictions for some physical fields than classical PINNs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-Informed Neural Networks (PINNs) have gained much attention in
various fields of engineering thanks to their capability of incorporating
physical laws into the models. PINNs integrate the physical constraints by
minimizing the residuals of the partial differential equations (PDEs) on a set of
collocation points. The distribution of these collocation points appears to
have a significant impact on the performance of PINNs, and the assessment of the
sampling methods for these points is still an active topic. In this paper, we
propose a Fixed-Budget Online Adaptive Mesh Learning (FBOAML) method, which
decomposes the domain into sub-domains and selects collocation points during
training based on the local maxima and local minima of the PDE residuals. The
stopping criterion is based on a reference data set, which leads to an adaptive
number of iterations for each specific problem. The effectiveness of FBOAML is
demonstrated in the context of non-parameterized and parameterized problems.
The impact of the hyper-parameters in FBOAML is investigated in this work. A
comparison with other adaptive sampling methods is also presented. The
numerical results demonstrate significant accuracy gains for PINNs trained with
FBOAML over classical PINNs with non-adaptive collocation points. We also apply
FBOAML to a complex industrial application involving coupling between
mechanical and thermal fields. We show that FBOAML is able to identify
high-gradient regions and even gives better predictions for some physical
fields than classical PINNs with collocation points taken from a pre-adapted
finite element mesh.
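A minimal sketch may help picture the point-selection step described above. The snippet below is an illustrative, one-dimensional NumPy version built on assumed choices (the function name, the equal-width sub-domain split, and the equal per-sub-domain quota are not taken from the paper): candidate points are grouped into sub-domains, the local maxima and minima of the residual are located in each sub-domain, and only a fixed budget of them is kept. The reference-data-based stopping criterion is not shown.

```python
import numpy as np

def select_collocation_points(x, residual, n_subdomains=4, budget=100):
    """Pick new collocation points at local extrema of the PDE residual.

    Illustrative sketch of a fixed-budget selection: the domain is split
    into equal sub-domains, the local maxima/minima of the residual are
    located in each one, and an equal per-sub-domain quota derived from
    the fixed budget limits how many points are retained.
    """
    x = np.asarray(x).ravel()
    residual = np.asarray(residual).ravel()
    order = np.argsort(x)                      # work along a sorted 1D axis
    x, residual = x[order], residual[order]

    edges = np.linspace(x.min(), x.max(), n_subdomains + 1)
    quota = max(1, budget // n_subdomains)     # fixed budget shared equally
    selected = []

    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        xs, rs = x[mask], residual[mask]
        if xs.size < 3:
            continue
        # interior local maxima and minima of the residual
        is_max = (rs[1:-1] > rs[:-2]) & (rs[1:-1] > rs[2:])
        is_min = (rs[1:-1] < rs[:-2]) & (rs[1:-1] < rs[2:])
        extrema = np.where(is_max | is_min)[0] + 1
        if extrema.size == 0:
            continue
        # keep the extrema with the largest residual magnitude, up to the quota
        keep = extrema[np.argsort(np.abs(rs[extrema]))[::-1][:quota]]
        selected.append(xs[keep])

    return np.concatenate(selected) if selected else np.empty(0)
```

In a full training loop, `residual` would be the PDE residual evaluated on a dense candidate set at the current iteration, the returned points would refresh the collocation set, and, as stated in the abstract, training would stop once the error on a reference data set no longer improves.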
Related papers
- RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z)
- Reconfigurable Intelligent Surface (RIS)-Assisted Entanglement Distribution in FSO Quantum Networks [62.87033427172205]
Quantum networks (QNs) relying on free-space optical (FSO) quantum channels can support quantum applications in environments where establishing an optical fiber infrastructure is challenging and costly.
A reconfigurable intelligent surface (RIS)-assisted FSO-based QN is proposed as a cost-efficient framework providing a virtual line-of-sight between users for entanglement distribution.
arXiv Detail & Related papers (2024-01-19T17:16:40Z)
- Grad-Shafranov equilibria via data-free physics informed neural networks [0.0]
We show that PINNs can accurately and effectively solve the Grad-Shafranov equation with several different boundary conditions.
We introduce a parameterized PINN framework, expanding the input space to include variables such as pressure, aspect ratio, elongation, and triangularity (see the sketch below).
arXiv Detail & Related papers (2023-11-22T16:08:38Z)
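The parameterized-input idea in the entry above mirrors the "towards parameterized problem inference" direction of the present paper: physical parameters are fed to the network alongside the spatial coordinates, so a single model covers a family of problems. The PyTorch module below is a hypothetical, minimal sketch of that idea only; the class name, layer sizes, and the four generic parameters standing in for pressure, aspect ratio, elongation, and triangularity are assumptions for illustration, not the framework from that paper.

```python
import torch
import torch.nn as nn

class ParameterizedPINN(nn.Module):
    """Minimal MLP taking coordinates plus physical parameters as inputs.

    A parameterized PINN simply concatenates the spatial coordinates with a
    problem-parameter vector, so one network represents a whole family of
    solutions (illustrative architecture only).
    """

    def __init__(self, n_coords=2, n_params=4, width=64, depth=4):
        super().__init__()
        layers, in_dim = [], n_coords + n_params
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers.append(nn.Linear(in_dim, 1))    # scalar output field
        self.net = nn.Sequential(*layers)

    def forward(self, coords, params):
        # coords: (N, n_coords), params: (N, n_params)
        return self.net(torch.cat([coords, params], dim=-1))

# usage: one batch can mix many parameter configurations
model = ParameterizedPINN()
coords = torch.rand(128, 2, requires_grad=True)   # spatial points
params = torch.rand(128, 4)                       # sampled problem parameters
u = model(coords, params)                         # predicted field values
```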
- Optimization Guarantees of Unfolded ISTA and ADMM Networks With Smooth Soft-Thresholding [57.71603937699949]
We study optimization guarantees, i.e., achieving near-zero training loss with the increase in the number of learning epochs.
We show that the threshold on the number of training samples increases with the increase in the network width.
arXiv Detail & Related papers (2023-09-12T13:03:47Z)
- Adaptive Self-supervision Algorithms for Physics-informed Neural Networks [59.822151945132525]
Physics-informed neural networks (PINNs) incorporate physical knowledge from the problem domain as a soft constraint on the loss function.
We study the impact of the location of the collocation points on the trainability of these models.
We propose a novel adaptive collocation scheme that progressively allocates more collocation points to areas where the model is making higher errors (a toy version is sketched below).
arXiv Detail & Related papers (2022-07-08T18:17:06Z)
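The entry above describes progressively allocating more collocation points to high-error regions. The snippet below is a toy sketch of one way to do that, resampling a candidate pool with probability proportional to the current residual magnitude; the proportional-sampling rule and the names used are illustrative assumptions rather than the algorithm of that paper.

```python
import numpy as np

def resample_collocation(candidates, residual, n_points=200, rng=None):
    """Draw collocation points with probability proportional to |residual|.

    Sketch of an error-proportional allocation: regions where the PDE
    residual is currently large receive more collocation points in the
    next training round. Assumes n_points <= len(candidates).
    """
    rng = np.random.default_rng() if rng is None else rng
    weights = np.abs(residual) + 1e-12          # avoid an all-zero distribution
    probs = weights / weights.sum()
    idx = rng.choice(len(candidates), size=n_points, replace=False, p=probs)
    return candidates[idx]
```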
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs (a toy version is sketched below).
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
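The point-weighting idea in the GA-PINN entry above keeps the collocation points fixed and instead reweights their contribution to the residual loss, boosting points that are currently hard to fit. The sketch below shows a hypothetical AdaBoost-flavoured multiplicative update of such weights; the update rule, the `eta` parameter, and the function names are assumptions for illustration, not the formula from that paper.

```python
import numpy as np

def update_point_weights(weights, residual, eta=1.0):
    """One multiplicative reweighting step for collocation-point loss weights.

    Points with a large current PDE residual get exponentially boosted
    weights, so the weighted residual loss focuses subsequent training
    on the hard points (AdaBoost-style, illustrative only).
    """
    boosted = weights * np.exp(eta * np.abs(residual))
    return boosted / boosted.sum()              # keep the weights normalized

def weighted_residual_loss(weights, residual):
    """Weighted mean-square PDE residual, replacing a uniform average."""
    return np.sum(weights * residual ** 2)
```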
- Robust Learning of Physics Informed Neural Networks [2.86989372262348]
Physics-informed Neural Networks (PINNs) have been shown to be effective in solving partial differential equations.
This paper shows that a PINN can be sensitive to errors in training data and can overfit, dynamically propagating these errors over the solution domain of the PDE.
arXiv Detail & Related papers (2021-10-26T00:10:57Z)
- Optimal Transport Based Refinement of Physics-Informed Neural Networks [0.0]
We propose a refinement strategy for the well-known Physics-Informed Neural Networks (PINNs) for solving partial differential equations (PDEs), based on the concept of Optimal Transport (OT).
PINN solvers have been found to suffer from a host of issues: spectral bias in fully-connected architectures, unstable gradient pathologies, and difficulties with convergence and accuracy.
We present a novel training strategy for solving the Fokker-Planck-Kolmogorov Equation (FPKE) using OT-based sampling to supplement the existing PINNs framework.
arXiv Detail & Related papers (2021-05-26T02:51:20Z)
- A hybrid MGA-MSGD ANN training approach for approximate solution of linear elliptic PDEs [0.0]
We introduce a hybrid "Modified Genetic-Multilevel Gradient Descent" (MGA-MSGD) training algorithm.
It considerably improves the accuracy and efficiency of solving 3D mechanical problems described in strong form by PDEs via ANNs.
arXiv Detail & Related papers (2020-12-18T10:59:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.