Stochastic Scaling in Loss Functions for Physics-Informed Neural
Networks
- URL: http://arxiv.org/abs/2208.03776v1
- Date: Sun, 7 Aug 2022 17:12:39 GMT
- Title: Stochastic Scaling in Loss Functions for Physics-Informed Neural
Networks
- Authors: Ethan Mills, Alexey Pozdnyakov
- Abstract summary: Trained neural networks act as universal function approximators, able to numerically solve differential equations in a novel way.
Variations on the traditional loss function and training parameters show promise in making neural network-aided solutions more efficient.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Differential equations are used in a wide variety of disciplines, describing
the complex behavior of the physical world. Analytic solutions to these
equations are often difficult to obtain, limiting our current ability to
solve complex differential equations and necessitating sophisticated numerical
methods to approximate solutions. Trained neural networks act as universal
function approximators, able to numerically solve differential equations in a
novel way. In this work, methods and applications of neural network algorithms
for numerically solving differential equations are explored, with an emphasis
on varying loss functions and biological applications. Variations on the
traditional loss function and training parameters show promise in making neural
network-aided solutions more efficient, allowing for the investigation of more
complex equations governing biological principles.
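The abstract describes varying the loss function used to train a physics-informed network. As a minimal illustration of a stochastically scaled composite PINN loss — assuming a simple first-order ODE, and modeling the "stochastic scaling" as per-step random weights on the loss terms, since the abstract does not specify the exact scheme — a dependency-light sketch:

```python
# Illustrative sketch only: the exact scaling scheme is not given in the
# abstract, so "stochastic scaling" is modeled here as re-drawing the
# relative weights of the loss terms at every training step (a hypothetical
# choice). The ODE residual uses finite differences instead of automatic
# differentiation to keep the sketch dependency-free.
import numpy as np

def pinn_loss(y, x, rng):
    """Composite PINN-style loss for the ODE y' = -y with y(0) = 1.

    y : candidate solution values sampled on the grid x.
    """
    dydx = np.gradient(y, x)              # approximate y'(x)
    residual = np.mean((dydx + y) ** 2)   # physics term: y' + y = 0
    boundary = (y[0] - 1.0) ** 2          # initial-condition term

    # Stochastic scaling: per-step random weights on each loss term.
    w_res, w_bc = rng.uniform(0.5, 1.5, size=2)
    return w_res * residual + w_bc * boundary

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 101)
exact = np.exp(-x)                        # true solution of the ODE
print(pinn_loss(exact, x, rng))           # near zero for the exact solution
```

In a training loop this loss would be minimized over network parameters; randomizing the term weights changes how strongly the physics residual and the boundary condition pull on each step.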
Related papers
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Neuro-symbolic partial differential equation solver [0.0]
We present a strategy for developing mesh-free neuro-symbolic partial differential equation solvers from numerical discretizations found in scientific computing.
This strategy is unique in that it can be used to efficiently train neural network surrogate models for the solution functions and the differential operators.
arXiv Detail & Related papers (2022-10-25T22:56:43Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- One-Shot Transfer Learning of Physics-Informed Neural Networks [2.6084034060847894]
We present a framework for transfer learning PINNs that results in one-shot inference for linear systems of both ordinary and partial differential equations.
This means that highly accurate solutions to many unknown differential equations can be obtained instantaneously without retraining an entire network.
arXiv Detail & Related papers (2021-10-21T17:14:58Z)
- Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics informed neural networks have difficulty resolving localized effects and strongly non-linear solutions through optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement, and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)
- Neural Network Approximations of Compositional Functions With Applications to Dynamical Systems [3.660098145214465]
We develop an approximation theory for compositional functions and their neural network approximations.
We identify a set of key features of compositional functions and the relationship between the features and the complexity of neural networks.
In addition to function approximations, we prove several formulae of error upper bounds for neural networks.
arXiv Detail & Related papers (2020-12-03T04:40:25Z)
- Symbolically Solving Partial Differential Equations using Deep Learning [5.1964883240501605]
We describe a neural-based method for generating exact or approximate solutions to differential equations.
Unlike other neural methods, our system returns symbolic expressions that can be interpreted directly.
arXiv Detail & Related papers (2020-11-12T22:16:03Z)
- A Neuro-Symbolic Method for Solving Differential and Functional Equations [6.899578710832262]
We introduce a method for generating symbolic expressions to solve differential equations.
Unlike existing methods, our system does not require learning a language model over symbolic mathematics.
We show how the system can be effortlessly generalized to find symbolic solutions to other mathematical tasks.
arXiv Detail & Related papers (2020-11-04T17:13:25Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.