Bias-Variance Trade-off in Physics-Informed Neural Networks with
Randomized Smoothing for High-Dimensional PDEs
- URL: http://arxiv.org/abs/2311.15283v1
- Date: Sun, 26 Nov 2023 12:50:28 GMT
- Authors: Zheyuan Hu, Zhouhao Yang, Yezhen Wang, George Em Karniadakis, Kenji
Kawaguchi
- Abstract summary: Physics-informed neural networks (PINNs) have been proven effective for low-dimensional partial differential equations (PDEs).
We present a comprehensive analysis of biases in RS-PINN, attributing them to the nonlinearity of the Mean Squared Error (MSE) loss and the PDE nonlinearity.
We propose tailored bias correction techniques based on the order of PDE nonlinearity.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While physics-informed neural networks (PINNs) have been proven effective for
low-dimensional partial differential equations (PDEs), the computational cost
remains a hurdle in high-dimensional scenarios. This is particularly pronounced
when computing high-order and high-dimensional derivatives in the
physics-informed loss. Randomized Smoothing PINN (RS-PINN) introduces Gaussian
noise for stochastic smoothing of the original neural net model, enabling Monte
Carlo methods for derivative approximation, eliminating the need for costly
auto-differentiation. Despite its computational efficiency in high dimensions,
RS-PINN introduces biases in both loss and gradients, negatively impacting
convergence, especially when coupled with stochastic gradient descent (SGD). We
present a comprehensive analysis of biases in RS-PINN, attributing them to the
nonlinearity of the Mean Squared Error (MSE) loss and the PDE nonlinearity. We
propose tailored bias correction techniques based on the order of PDE
nonlinearity. The unbiased RS-PINN allows for a detailed examination of its
pros and cons compared to the biased version. Specifically, the biased version
has a lower variance and runs faster than the unbiased version, but it is less
accurate due to the bias. To optimize the bias-variance trade-off, we combine
the two approaches in a hybrid method that balances the rapid convergence of
the biased version with the high accuracy of the unbiased version. In addition,
we present an enhanced implementation of RS-PINN. Extensive experiments on
diverse high-dimensional PDEs, including the Fokker-Planck, Hamilton-Jacobi-Bellman (HJB), viscous Burgers',
Allen-Cahn, and Sine-Gordon equations, illustrate the bias-variance trade-off
and highlight the effectiveness of the hybrid RS-PINN. Empirical guidelines are
provided for selecting biased, unbiased, or hybrid versions, depending on the
dimensionality and nonlinearity of the specific PDE problem.
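The mechanism the abstract describes can be sketched in a few lines of NumPy. This is an illustration on a toy function under our own assumptions, not the authors' implementation; all names are made up. For the smoothed model u_sigma(x) = E[u(x + delta)] with delta ~ N(0, sigma^2 I), Stein's identity gives grad u_sigma(x) = E[delta * u(x + delta)] / sigma^2, so a derivative costs only forward evaluations. Plugging a single Monte Carlo batch into a squared residual then adds a Var/N bias, which averaging the product of two independent batches removes:

```python
import numpy as np

rng = np.random.default_rng(0)

def u(x):
    """Toy stand-in for a neural net solution: u(x) = sum_j sin(x_j)."""
    return np.sin(x).sum(axis=-1)

def smoothed_grad_mc(x, sigma=0.1, n=200_000):
    """Monte Carlo gradient of the Gaussian-smoothed u via Stein's identity:
    grad u_sigma(x) = E[delta * u(x + delta)] / sigma**2 -- no autodiff needed."""
    delta = rng.normal(0.0, sigma, size=(n, x.size))
    return (delta * u(x + delta)[:, None]).mean(axis=0) / sigma**2

x = np.array([0.3, -1.2, 2.0])
g_mc = smoothed_grad_mc(x)
g_true = np.cos(x)  # exact gradient of the unsmoothed u
# g_mc matches g_true up to MC noise plus an O(sigma**2) smoothing bias.

# Bias of the squared loss: for a residual r estimated by an N-sample mean,
# E[(N-sample mean)**2] = (E r)**2 + Var(r)/N.  Multiplying two independent
# batches is one way to remove it (the paper derives corrections per order of
# PDE nonlinearity; this pairing trick is our simplified illustration).
r_mean, r_std, N, trials = 1.0, 2.0, 8, 20_000
s = rng.normal(r_mean, r_std, size=(trials, 2, N))   # two independent batches
biased = (s[:, 0].mean(axis=1) ** 2).mean()                      # ~ 1 + 4/8 = 1.5
unbiased = (s[:, 0].mean(axis=1) * s[:, 1].mean(axis=1)).mean()  # ~ 1.0
```

The bias-variance trade-off the abstract discusses is visible here: the biased estimator uses one batch (cheaper, lower variance per evaluation) while the unbiased one needs two, which is why the paper's hybrid scheme mixes the two.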
Related papers
- PIG: Physics-Informed Gaussians as Adaptive Parametric Mesh Representations [5.4087282763977855]
The approximation of Partial Differential Equations (PDEs) using neural networks has seen significant advancements.
PINNs often suffer from limited accuracy due to the spectral bias of Multi-Layer Perceptrons (MLPs), which struggle to learn high-frequency and non-linear components.
We propose Physics-Informed Gaussians (PIGs), which combine feature embeddings using Gaussian functions with a lightweight neural network.
arXiv Detail & Related papers (2024-12-08T16:58:29Z) - RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs)
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - An Extreme Learning Machine-Based Method for Computational PDEs in
Higher Dimensions [1.2981626828414923]
We present two effective methods for solving high-dimensional partial differential equations (PDEs) based on randomized neural networks.
We present ample numerical simulations for a number of high-dimensional linear/nonlinear stationary/dynamic PDEs to demonstrate their performance.
arXiv Detail & Related papers (2023-09-13T15:59:02Z) - PDE-Refiner: Achieving Accurate Long Rollouts with Neural PDE Solvers [40.097474800631]
Time-dependent partial differential equations (PDEs) are ubiquitous in science and engineering.
Deep neural network-based surrogates have gained increasing interest.
arXiv Detail & Related papers (2023-08-10T17:53:05Z) - PDE+: Enhancing Generalization via PDE with Adaptive Distributional
Diffusion [66.95761172711073]
The generalization of neural networks is a central challenge in machine learning.
We propose to enhance it directly through the underlying function of neural networks, rather than focusing on adjusting input data.
We put this theoretical framework into practice as $\textbf{PDE}+$ ($\textbf{PDE}$ with $\textbf{A}$daptive $\textbf{D}$istributional $\textbf{D}$iffusion).
arXiv Detail & Related papers (2023-05-25T08:23:26Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Parsimonious Physics-Informed Random Projection Neural Networks for
Initial-Value Problems of ODEs and index-1 DAEs [0.0]
We address a physics-informed neural network based on random projections for the numerical solution of IVPs of nonlinear ODEs in linear-implicit form and index-1 DAEs.
Based on previous works on random projections, we prove the approximation capability of the scheme for ODEs in the canonical form and index-1 DAEs in the semiexplicit form.
arXiv Detail & Related papers (2022-03-10T12:34:46Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Implicit Bias of MSE Gradient Optimization in Underparameterized Neural
Networks [0.0]
We study the dynamics of a neural network in function space when optimizing the mean squared error via gradient flow.
We show that the network learns eigenfunctions of an integral operator $T_{K^\infty}$ determined by the Neural Tangent Kernel (NTK).
We conclude that damped deviations offer a simple and unifying perspective on the dynamics when optimizing the squared error.
arXiv Detail & Related papers (2022-01-12T23:28:41Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.