MC-Nonlocal-PINNs: handling nonlocal operators in PINNs via Monte Carlo sampling
- URL: http://arxiv.org/abs/2212.12984v1
- Date: Mon, 26 Dec 2022 01:58:14 GMT
- Title: MC-Nonlocal-PINNs: handling nonlocal operators in PINNs via Monte Carlo sampling
- Authors: Xiaodong Feng and Yue Qian and Wanfang Shen
- Abstract summary: We propose Monte Carlo Nonlocal physics-informed neural networks (MC-Nonlocal-PINNs).
MC-Nonlocal-PINNs handle nonlocal operators in a Monte Carlo way, resulting in a very stable approach for high-dimensional problems.
We present a variety of test problems, including high-dimensional Volterra-type integral equations, hypersingular integral equations, and nonlocal PDEs.
- Score: 3.97478982737167
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose Monte Carlo Nonlocal physics-informed neural networks
(MC-Nonlocal-PINNs), a generalization of the MC-fPINNs of
\cite{guo2022monte}, for solving general nonlocal models such as integral
equations and nonlocal PDEs. As in MC-fPINNs, our MC-Nonlocal-PINNs
handle nonlocal operators in a Monte Carlo way, resulting in a very stable
approach for high-dimensional problems. We present a variety of test problems,
including high-dimensional Volterra-type integral equations, hypersingular
integral equations, and nonlocal PDEs, to demonstrate the effectiveness of our
approach.
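To make the core idea concrete, here is a minimal sketch (an illustration, not the authors' code) of handling a nonlocal operator in a Monte Carlo way inside a PINN loss. For a Volterra integral equation u(x) = f(x) + \int_0^x K(x, s) u(s) ds, the integral term is re-estimated at every training step by averaging the integrand over samples s ~ U(0, x). The kernel, forcing term, and network sizes below are illustrative choices, not the paper's settings.

```python
# Minimal sketch (not the authors' code): a PINN for a 1D Volterra integral
# equation u(x) = f(x) + \int_0^x K(x, s) u(s) ds, where the nonlocal integral
# is estimated by Monte Carlo sampling at each training step.
import torch

torch.manual_seed(0)

# Illustrative problem: K(x, s) = x - s, exact solution u(x) = sin(x), so
# f(x) = sin(x) - \int_0^x (x - s) sin(s) ds = 2*sin(x) - x.
def kernel(x, s):
    return x - s

def forcing(x):
    return 2.0 * torch.sin(x) - x

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

n_collocation, n_mc = 64, 128          # collocation points and MC samples per point
for step in range(5000):
    x = torch.rand(n_collocation, 1)   # collocation points in (0, 1)
    # Monte Carlo estimate of \int_0^x K(x, s) u(s) ds with s ~ U(0, x):
    # integral ~= x * mean_i K(x, s_i) u(s_i)
    s = x * torch.rand(n_collocation, n_mc)              # (n_collocation, n_mc)
    u_s = net(s.reshape(-1, 1)).reshape(n_collocation, n_mc)
    integral = x * (kernel(x, s) * u_s).mean(dim=1, keepdim=True)
    residual = net(x) - forcing(x) - integral
    loss = (residual ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        print(f"step {step:5d}  loss {loss.item():.3e}")
```

Because the samples are redrawn every iteration, the estimator stays unbiased while keeping the per-step cost independent of any quadrature grid, which is what makes this kind of sampling attractive in high dimensions.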
Related papers
- Improvement of Bayesian PINN Training Convergence in Solving Multi-scale PDEs with Noise [34.11898314129823]
In practice, the Hamiltonian Monte Carlo (HMC) sampler used to estimate the internal parameters of a BPINN often runs into trouble.
We develop a robust multi-scale Bayesian PINN (dubbed MBPINN) method by integrating multi-scale neural networks (MscaleDNN) and Bayesian inference.
Our findings indicate that the proposed method can avoid HMC failures and provide valid results.
arXiv Detail & Related papers (2024-08-18T03:20:16Z) - Randomized Physics-Informed Neural Networks for Bayesian Data Assimilation [44.99833362998488]
We propose a randomized physics-informed neural network (PINN) or rPINN method for uncertainty quantification in inverse partial differential equation (PDE) problems with noisy data.
For the linear Poisson equation, HMC and rPINN produce similar distributions, but rPINN is on average 27 times faster than HMC.
For the non-linear Poisson and diffusion equations, the HMC method fails to converge because a single HMC chain cannot sample multiple modes of the posterior distribution of the PINN parameters in a reasonable amount of time.
arXiv Detail & Related papers (2024-07-05T16:16:47Z) - Tackling the Curse of Dimensionality in Fractional and Tempered Fractional PDEs with Physics-Informed Neural Networks [24.86574584293979]
Physics-informed neural networks (PINNs) offer a promising solution due to their universal approximation ability, generalization ability, and mesh-free training.
We extend MC-fPINN to tempered fractional PDEs to address these issues, resulting in the Monte Carlo tempered fractional PINN (MC-tfPINN).
We validate our methods on various forward and inverse problems of fractional and tempered fractional PDEs, scaling up to 100,000 dimensions.
arXiv Detail & Related papers (2024-06-17T16:26:18Z) - RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm, region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z) - GMC-PINNs: A new general Monte Carlo PINNs method for solving fractional partial differential equations on irregular domains [4.051523221722475]
We propose a new general (quasi) Monte Carlo PINN for solving fPDEs on irregular domains.
We use a more general Monte Carlo approximation method to solve different fPDEs, which is valid for fractional differentiation under any definition.
Our results demonstrate the effectiveness of GMC-PINNs in dealing with irregular domain problems and show a higher computational efficiency compared to the original fPINN method.
arXiv Detail & Related papers (2024-04-30T21:52:15Z) - Adversarial Training for Physics-Informed Neural Networks [4.446564162927513]
We propose an adversarial training strategy for PINNs, termed AT-PINNs.
AT-PINNs enhance the robustness of PINNs by fine-tuning the model with adversarial samples.
We apply AT-PINNs to the elliptic equation with multi-scale coefficients, the Poisson equation with multi-peak solutions, the Burgers equation with sharp solutions, and the Allen-Cahn equation.
arXiv Detail & Related papers (2023-10-18T08:28:43Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles (a minimal sketch of this representation appears after this list).
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Monte Carlo PINNs: deep learning approach for forward and inverse problems involving high dimensional fractional partial differential equations [8.378422134042722]
We introduce a sampling-based machine learning approach, Monte Carlo physics-informed neural networks (MC-PINNs), for solving forward and inverse fractional partial differential equations (FPDEs).
As a generalization of physics-informed neural networks (PINNs), our method relies on deep neural network surrogates in addition to an approximation strategy for computing the fractional derivatives of the outputs.
We validate the performance of MC-PINNs via several examples that include high dimensional integral fractional Laplacian equations, parametric identification of time-space fractional PDEs, and fractional diffusion equation with random inputs.
arXiv Detail & Related papers (2022-03-16T09:52:05Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
The physics-informed neural operator (PINO) is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
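The probabilistic representation mentioned in the Monte Carlo Neural PDE Solver entry above can be illustrated with a simple, standard example (an illustration of the general idea, not that paper's method): for the heat equation u_t = nu * u_xx with initial data g, the Feynman-Kac formula gives u(x, t) = E[g(x + sqrt(2*nu*t) * Z)] with Z ~ N(0, 1), so pointwise solution values can be estimated by averaging over random particles. The initial condition and parameter values below are illustrative.

```python
# Minimal sketch (not from the cited paper): Monte Carlo evaluation of the heat
# equation u_t = nu * u_xx, u(x, 0) = g(x), via its probabilistic representation
# u(x, t) = E[ g(x + sqrt(2 * nu * t) * Z) ], Z ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)

def mc_heat_solution(x, t, g, nu=0.1, n_particles=100_000):
    """Estimate u(x, t) by averaging the initial data over random particles."""
    z = rng.standard_normal(n_particles)
    return g(x + np.sqrt(2.0 * nu * t) * z).mean()

# Example: Gaussian initial data; the exact solution stays Gaussian with
# variance 1 + 2*nu*t, which gives a quick sanity check.
g = lambda x: np.exp(-x**2 / 2.0)
x, t, nu = 0.5, 1.0, 0.1
estimate = mc_heat_solution(x, t, g, nu)
sigma2 = 1.0 + 2.0 * nu * t
exact = np.sqrt(1.0 / sigma2) * np.exp(-x**2 / (2.0 * sigma2))
print(f"MC estimate {estimate:.4f}  vs  exact {exact:.4f}")
```

Pointwise estimates of this kind can serve as (noisy) supervision targets for a neural solver, which is the sense in which particle ensembles and neural networks are combined in such approaches.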
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.