About optimal loss function for training physics-informed neural
networks under respecting causality
- URL: http://arxiv.org/abs/2304.02282v1
- Date: Wed, 5 Apr 2023 08:10:40 GMT
- Title: About optimal loss function for training physics-informed neural
networks under respecting causality
- Authors: Vasiliy A. Es'kin, Danil V. Davydov, Ekaterina D. Egorova, Alexey O.
Malkhanov, Mikhail A. Akhukov, Mikhail E. Smorkalov
- Abstract summary: The advantage of using the modified problem within the physics-informed neural network (PINN) methodology is that the loss function can be represented as a single term associated with the differential equations.
Numerical experiments have been carried out for a number of problems, demonstrating the accuracy of the proposed methods.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A method is presented that allows a problem described by
differential equations with initial and boundary conditions to be reduced to a
problem described only by differential equations. The advantage of using the
modified problem within the physics-informed neural network (PINN) methodology
is that the loss function can be represented as a single term associated with
the differential equations, eliminating the need to tune the scaling
coefficients for the terms related to boundary and initial conditions. The
weighted loss functions respecting causality are modified, and new weighted
loss functions based on generalized functions are derived. Numerical
experiments have been carried out for a number of problems, demonstrating the
accuracy of the proposed methods.
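To make the abstract's main point concrete, below is a minimal sketch, not the authors' generalized-function construction: it uses a common hard-constraint ansatz that satisfies the initial and boundary conditions exactly, so the training loss reduces to a single PDE-residual term, and it weights per-time-slice residuals in the spirit of "respecting causality". The 1D heat equation, network architecture, epsilon, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch, NOT the paper's generalized-function construction: a PINN for the
# 1D heat equation u_t = kappa * u_xx on (0,1) x (0,T] whose ansatz satisfies the
# initial condition u(x,0) = sin(pi x) and the boundary conditions u(0,t)=u(1,t)=0
# exactly, so the loss keeps only the PDE-residual term.  The causal weights on
# time slices follow the general "respecting causality" idea (later residuals are
# down-weighted until earlier ones are small); kappa, eps, the architecture and
# the sampling scheme are illustrative assumptions.
import math
import torch

torch.manual_seed(0)
kappa, T, eps = 0.1, 1.0, 1.0

net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def u(x, t):
    # Hard-constrained ansatz: equals sin(pi x) at t = 0 and vanishes at x = 0, 1.
    u0 = torch.sin(math.pi * x)
    return u0 + t * x * (1.0 - x) * net(torch.cat([x, t], dim=-1))

def pde_residual(x, t):
    out = u(x, t)
    u_t = torch.autograd.grad(out, t, torch.ones_like(out), create_graph=True)[0]
    u_x = torch.autograd.grad(out, x, torch.ones_like(out), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t - kappa * u_xx

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
n_t, n_x = 32, 64                       # time slices, spatial points per slice
t_grid = torch.linspace(0.0, T, n_t)

for step in range(1000):
    opt.zero_grad()
    # Per-time-slice residual losses L_i, ordered from t = 0 to t = T.
    slice_losses = []
    for ti in t_grid:
        x = torch.rand(n_x, 1, requires_grad=True)
        t = torch.full_like(x, ti.item()).requires_grad_(True)
        slice_losses.append(pde_residual(x, t).pow(2).mean())
    L = torch.stack(slice_losses)
    # Causal weights w_i = exp(-eps * sum_{k<i} L_k), kept out of the graph.
    with torch.no_grad():
        w = torch.exp(-eps * torch.cumsum(torch.cat([torch.zeros(1), L[:-1]]), 0))
    loss = (w * L).mean()               # single-term, causally weighted loss
    loss.backward()
    opt.step()
```

Because the ansatz enforces the conditions by construction, there are no boundary or initial loss terms left to balance, which is the tuning burden the abstract says the modified problem removes; the paper itself achieves this by folding the conditions into the differential equations via generalized functions rather than via a hard-constraint ansatz.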
Related papers
- Transformed Physics-Informed Neural Networks for The Convection-Diffusion Equation [0.0]
Singularly perturbed problems have solutions with steep boundary layers that are hard to resolve numerically.
Traditional numerical methods, such as Finite Difference Methods, require a refined mesh to obtain stable and accurate solutions.
We consider the use of Physics-Informed Neural Networks (PINNs) to produce numerical solutions of singularly perturbed problems.
arXiv Detail & Related papers (2024-09-12T00:24:21Z) - A Physics-driven GraphSAGE Method for Physical Process Simulations
Described by Partial Differential Equations [2.1217718037013635]
A physics-driven GraphSAGE approach is presented to solve problems governed by irregular PDEs.
A distance-related edge feature and a feature mapping strategy are devised to help training and convergence.
A robust PDE surrogate model for heat conduction problems parameterized by a Gaussian singularity random field source is successfully established.
arXiv Detail & Related papers (2024-03-13T14:25:15Z) - On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function, that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss allows for considering more practical techniques, such as time-varying learning rates and feature normalization.
arXiv Detail & Related papers (2023-12-13T02:11:07Z) - A Stable and Scalable Method for Solving Initial Value PDEs with Neural
Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z) - Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is an ambitious attempt to automate the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural
Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Certified machine learning: Rigorous a posteriori error bounds for PDE
defined PINNs [0.0]
We present a rigorous upper bound on the prediction error of physics-informed neural networks.
We apply this to four problems: the transport equation, the heat equation, the Navier-Stokes equation and the Klein-Gordon equation.
arXiv Detail & Related papers (2022-10-07T09:49:18Z) - Stochastic Scaling in Loss Functions for Physics-Informed Neural
Networks [0.0]
Trained neural networks act as universal function approximators, able to numerically solve differential equations in a novel way.
Variations on the traditional loss function and training parameters show promise in making neural-network-aided solutions more efficient.
arXiv Detail & Related papers (2022-08-07T17:12:39Z) - Enhanced Physics-Informed Neural Networks with Augmented Lagrangian
Relaxation Method (AL-PINNs) [1.7403133838762446]
Physics-Informed Neural Networks (PINNs) are powerful approximators of solutions to nonlinear partial differential equations (PDEs).
We propose an Augmented Lagrangian relaxation method for PINNs (AL-PINNs).
We demonstrate through various numerical experiments that AL-PINNs yield a much smaller relative error compared with that of state-of-the-art adaptive loss-balancing algorithms.
arXiv Detail & Related papers (2022-04-29T08:33:11Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulties in displaying localized effects and strong non-linear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world µCT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)