nPINNs: nonlocal Physics-Informed Neural Networks for a parametrized
nonlocal universal Laplacian operator. Algorithms and Applications
- URL: http://arxiv.org/abs/2004.04276v1
- Date: Wed, 8 Apr 2020 21:48:30 GMT
- Title: nPINNs: nonlocal Physics-Informed Neural Networks for a parametrized
nonlocal universal Laplacian operator. Algorithms and Applications
- Authors: Guofei Pang, Marta D'Elia, Michael Parks, George E. Karniadakis
- Abstract summary: Physics-informed neural networks (PINNs) are effective in solving inverse problems based on differential and integral equations with sparse, noisy, unstructured, and multi-fidelity data.
In this paper, we extend PINNs to parameter and function inference for integral equations such as the nonlocal Poisson and nonlocal turbulence models.
Our results show that nPINNs can jointly infer the variable-order function $\alpha(y)$ as well as $\delta$.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) are effective in solving inverse
problems based on differential and integral equations with sparse, noisy,
unstructured, and multi-fidelity data. PINNs incorporate all available
information into a loss function, thus recasting the original problem into an
optimization problem. In this paper, we extend PINNs to parameter and function
inference for integral equations such as nonlocal Poisson and nonlocal
turbulence models, and we refer to them as nonlocal PINNs (nPINNs). The
contribution of the paper is three-fold. First, we propose a unified nonlocal
operator that converges to the classical Laplacian as one of the operator
parameters, the nonlocal interaction radius $\delta$, goes to zero, and to the
fractional Laplacian as $\delta$ goes to infinity. This universal operator
forms a super-set of classical Laplacian and fractional Laplacian operators
and, thus, has the potential to fit a broad spectrum of data sets. We provide
theoretical convergence rates with respect to $\delta$ and verify them via
numerical experiments. Second, we use nPINNs to estimate the two parameters,
$\delta$ and $\alpha$. The strong non-convexity of the loss function, which
yields multiple (good) local minima, reveals the operator-mimicking
phenomenon: different pairs of estimated parameters can produce
solutions of comparable accuracy. Third, we propose another nonlocal operator
with spatially variable order $\alpha(y)$, which is more suitable for modeling
turbulent Couette flow. Our results show that nPINNs can jointly infer this
function as well as $\delta$. Also, these parameters exhibit a universal
behavior with respect to the Reynolds number, a finding that contributes to our
understanding of nonlocal interactions in wall-bounded turbulence.
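To make the first contribution concrete, here is a minimal sketch of a truncated fractional-kernel operator with the two stated limits. The exact normalization constant $C_{d,\alpha}(\delta)$ and the paper's precise definition are not reproduced here, so treat this form as an illustrative assumption rather than the paper's operator:

```latex
% Illustrative truncated-kernel nonlocal operator (assumed form; the
% paper's exact normalization constant is not reproduced here).
\[
  \mathcal{L}^{\alpha}_{\delta} u(x)
    \;=\; C_{d,\alpha}(\delta) \int_{B_{\delta}(x)}
      \frac{u(y) - u(x)}{\lVert y - x \rVert^{\, d + \alpha}} \,\mathrm{d}y ,
\]
% with the two limits claimed in the abstract:
\[
  \lim_{\delta \to 0} \mathcal{L}^{\alpha}_{\delta} u \;=\; \Delta u ,
  \qquad
  \lim_{\delta \to \infty} \mathcal{L}^{\alpha}_{\delta} u
    \;=\; -(-\Delta)^{\alpha/2} u .
\]
```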
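For the second contribution, a hypothetical PyTorch sketch of what joint parameter-and-state inference can look like: $\delta$ and $\alpha$ are registered as trainable parameters next to the network weights and recovered by minimizing a residual-plus-data loss. The Monte Carlo quadrature, the reparametrizations, and all names below are illustrative assumptions, not the paper's implementation; an accurate treatment of the singular kernel requires a dedicated quadrature rule.

```python
# Hypothetical PyTorch sketch of nPINN-style joint inference (not the
# paper's code): delta and alpha are trained alongside the network weights.
import torch

torch.manual_seed(0)

# Neural surrogate for the solution u(x), here in 1D.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

# Reparametrize so that delta > 0 and alpha lies in (0, 2).
log_delta = torch.nn.Parameter(torch.tensor(0.0))
alpha_raw = torch.nn.Parameter(torch.tensor(0.0))

opt = torch.optim.Adam(list(net.parameters()) + [log_delta, alpha_raw], lr=1e-3)

def f(x):
    # Hypothetical known forcing term of the nonlocal Poisson problem.
    return torch.sin(torch.pi * x)

def residual(x, n_quad=128):
    # Crude Monte Carlo estimate of L_{delta,alpha} u(x) - f(x); the
    # normalization constant is omitted and the singular kernel is clamped.
    delta = log_delta.exp()
    alpha = 2.0 * torch.sigmoid(alpha_raw)
    y = x + delta * (2.0 * torch.rand(x.shape[0], n_quad) - 1.0)  # y in B_delta(x)
    du = net(y.reshape(-1, 1)).reshape(x.shape[0], n_quad) - net(x)
    kern = du / (y - x).abs().clamp_min(1e-4) ** (1.0 + alpha)
    lap = (2.0 * delta) * kern.mean(dim=1, keepdim=True)  # |B_delta| * mean value
    return lap - f(x)

x_res = torch.rand(256, 1)            # interior collocation points
x_obs = torch.rand(32, 1)             # sparse "measurement" locations
u_obs = torch.sin(torch.pi * x_obs)   # synthetic observations of u

for step in range(2000):
    opt.zero_grad()
    loss = residual(x_res).pow(2).mean() + (net(x_obs) - u_obs).pow(2).mean()
    loss.backward()
    opt.step()
```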
Related papers
- On the estimation rate of Bayesian PINN for inverse problems [10.100602879566782]
Solving partial differential equations (PDEs) and their inverse problems using Physics-informed neural networks (PINNs) is a rapidly growing approach in the physics and machine learning community.
We study the behavior of a Bayesian PINN estimator of the solution of a PDE from $n$ independent noisy measurements of the solution.
arXiv Detail & Related papers (2024-06-21T01:13:18Z) - Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z) - MgNO: Efficient Parameterization of Linear Operators via Multigrid [4.096453902709292]
We introduce MgNO, utilizing multigrid structures to parameterize linear operators between neurons.
MgNO exhibits superior ease of training compared to other CNN-based models.
arXiv Detail & Related papers (2023-10-16T13:01:35Z) - Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs [93.82811501035569]
We introduce a new data-efficient and highly parallelizable operator learning approach with reduced memory requirements and better generalization.
MG-TFNO scales to large resolutions by leveraging local and global structures of full-scale, real-world phenomena.
We demonstrate superior performance on the turbulent Navier-Stokes equations where we achieve less than half the error with over 150x compression.
arXiv Detail & Related papers (2023-09-29T20:18:52Z) - Learning Only On Boundaries: a Physics-Informed Neural operator for
Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments show the method's effectiveness on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z) - Over-Parameterization Exponentially Slows Down Gradient Descent for
Learning a Single Neuron [49.45105570960104]
We prove the global convergence of randomly initialized gradient descent with an $O\left(T^{-3}\right)$ rate.
These two bounds jointly give an exact characterization of the convergence rate.
We show this potential function converges slowly, which implies the slow convergence rate of the loss function.
arXiv Detail & Related papers (2023-02-20T15:33:26Z) - Fourier Continuation for Exact Derivative Computation in
Physics-Informed Neural Operators [53.087564562565774]
PINO is a machine learning architecture that has shown promising empirical results for learning partial differential equations.
We present an architecture that leverages Fourier continuation (FC) to apply the exact gradient method to PINO for nonperiodic problems.
arXiv Detail & Related papers (2022-11-29T06:37:54Z) - $\Delta$-PINNs: physics-informed neural networks on complex geometries [2.1485350418225244]
Physics-informed neural networks (PINNs) have demonstrated promise in solving forward and inverse problems involving partial differential equations.
To date, there is no clear way to inform PINNs about the topology of the domain where the problem is being solved.
We propose a novel positional encoding mechanism for PINNs based on the eigenfunctions of the Laplace-Beltrami operator.
arXiv Detail & Related papers (2022-09-08T18:03:19Z) - Data-driven soliton mappings for integrable fractional nonlinear wave
equations via deep learning with Fourier neural operator [7.485410656333205]
We extend the Fourier neural operator (FNO) to discover the soliton mapping between two function spaces.
To be specific, the fractional nonlinear Schrödinger (fNLS), fractional Korteweg-de Vries (fKdV), fractional modified Korteweg-de Vries (fmKdV) and fractional sine-Gordon (fsineG) equations proposed recently are studied.
arXiv Detail & Related papers (2022-08-29T06:48:26Z) - A Law of Robustness beyond Isoperimetry [84.33752026418045]
We prove a Lipschitzness lower bound $\Omega(\sqrt{n/p})$ of robustness of interpolating neural network parameters on arbitrary distributions.
We then show the potential benefit of overparametrization for smooth data when $n=\mathrm{poly}(d)$.
We disprove the potential existence of an $O(1)$-Lipschitz robust interpolating function when $n=\exp(\omega(d))$.
arXiv Detail & Related papers (2022-02-23T16:10:23Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.