New Designed Loss Functions to Solve Ordinary Differential Equations with Artificial Neural Network
- URL: http://arxiv.org/abs/2301.00636v1
- Date: Thu, 29 Dec 2022 11:26:31 GMT
- Title: New Designed Loss Functions to Solve Ordinary Differential Equations with Artificial Neural Network
- Authors: Xiao Xiong
- Abstract summary: This paper investigates the use of artificial neural networks (ANNs) to solve differential equations (DEs). In Section 2, the loss function is generalized to the $n^\text{th}$-order ordinary differential equation (ODE).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper investigates the use of artificial neural networks (ANNs) to solve differential equations (DEs) and the construction of loss functions that satisfy both a given DE and its initial/boundary conditions. In Section 2, the loss function is generalized to the $n^\text{th}$-order ordinary differential equation (ODE). Other methods of construction are examined in Section 3 and applied to three different models to assess their effectiveness.
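To make the construction concrete, below is a minimal sketch of the standard loss such papers generalize: a squared DE-residual term plus initial-condition penalties, shown for the example problem $u'' + u = 0$, $u(0)=0$, $u'(0)=1$ (the example problem, network size, and training settings are our illustrative choices, not taken from the paper).

```python
import torch

# ANN trial solution u_theta(x); the loss below penalizes both the ODE
# residual and the initial conditions, as in a standard PINN-style construction.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def loss_fn(x):
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = d2u + u                              # ODE residual: u'' + u = 0
    x0 = torch.zeros(1, 1, requires_grad=True)
    u0 = net(x0)
    du0 = torch.autograd.grad(u0, x0, torch.ones_like(u0), create_graph=True)[0]
    # Squared residual on collocation points plus squared errors in u(0), u'(0).
    return (residual ** 2).mean() + u0.pow(2).mean() + (du0 - 1).pow(2).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(5000):
    x = torch.rand(128, 1) * 2 * torch.pi           # collocation points in [0, 2*pi]
    opt.zero_grad()
    loss_fn(x).backward()
    opt.step()
# The trained network should approximate the exact solution u(x) = sin(x).
```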
Related papers
- deepFDEnet: A Novel Neural Network Architecture for Solving Fractional Differential Equations [0.0]
In each fractional differential equation, a deep neural network is used to approximate the unknown function.
The results show that the proposed architecture solves different forms of fractional differential equations with excellent precision.
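As background for readers unfamiliar with fractional derivatives, here is one standard discretization (Grünwald-Letnikov) that such solvers are commonly validated against; this is a textbook scheme, not necessarily the one used in deepFDEnet.

```python
import numpy as np

def gl_fractional_derivative(f_vals, alpha, h):
    """Grunwald-Letnikov approximation of the order-alpha derivative of f
    on a uniform grid with spacing h (a common textbook discretization;
    not necessarily the scheme used by deepFDEnet)."""
    n = len(f_vals)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):                   # w_k = (-1)^k * binom(alpha, k)
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    d = np.empty(n)
    for i in range(n):
        d[i] = np.dot(w[: i + 1], f_vals[i::-1]) / h ** alpha
    return d

# Sanity check: for f(t) = t, the order-1/2 derivative is 2*sqrt(t/pi).
t = np.linspace(0.0, 1.0, 1001)
approx = gl_fractional_derivative(t, 0.5, t[1] - t[0])
print(np.max(np.abs(approx[1:] - 2 * np.sqrt(t[1:] / np.pi))))  # small (O(h))
```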
arXiv Detail & Related papers (2023-09-14T12:58:40Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
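A one-line illustration of why identifiability is subtle (our example, not taken from the paper): a single function can solve several distinct equations, so the observed solution alone need not pin down the ODE.

```latex
% f(x) = e^x satisfies more than one ODE, so observing f alone
% does not identify the governing equation:
\[
  f(x) = e^{x} \quad\text{solves both}\quad y' = y \quad\text{and}\quad y'' = y .
\]
```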
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
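The curvature measure mentioned above, the Laplacian (trace of the Hessian) of the PINN loss over the network parameters, can be estimated without ever forming the Hessian. A hedged PyTorch sketch using Hutchinson's trace estimator (our illustration, not the benchmark authors' code):

```python
import torch

def loss_laplacian(loss_fn, params, n_samples=10):
    """Hutchinson estimate of the Laplacian (trace of the Hessian) of a
    scalar loss with respect to `params`; illustrative sketch only."""
    loss = loss_fn()
    grads = torch.autograd.grad(loss, params, create_graph=True)
    est = 0.0
    for _ in range(n_samples):
        # Rademacher probe vectors, one per parameter tensor.
        vs = [torch.randint_like(g, 0, 2) * 2 - 1 for g in grads]
        gv = sum((g * v).sum() for g, v in zip(grads, vs))
        hv = torch.autograd.grad(gv, params, retain_graph=True)  # Hessian-vector product
        est = est + sum((h * v).sum() for h, v in zip(hv, vs))   # v^T H v
    return est / n_samples

# Usage (names hypothetical): loss_laplacian(lambda: pinn_loss(x), list(net.parameters()))
```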
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- DEQGAN: Learning the Loss Function for PINNs with Generative Adversarial Networks [1.0499611180329804]
This work presents Differential Equation GAN (DEQGAN), a novel method for solving differential equations using generative adversarial networks.
We show that DEQGAN achieves multiple orders of magnitude lower mean squared errors than PINNs.
We also show that DEQGAN achieves solution accuracies that are competitive with popular numerical methods.
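Roughly, DEQGAN replaces the hand-designed squared-residual loss with a learned one: a discriminator is trained to distinguish the equation residuals from zeros, and the solution network (the generator) is trained to fool it. A simplified sketch of that idea, with an assumed toy ODE $u' + u = 0$ (our simplification, not the authors' code):

```python
import torch

G = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
D = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = torch.nn.BCEWithLogitsLoss()

for _ in range(2000):
    x = torch.rand(64, 1).requires_grad_(True)
    u = G(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = du + u   # toy ODE u' + u = 0 (an initial-condition term
                        # would be added in practice; omitted for brevity)
    # Discriminator: "real" = zero residuals, "fake" = the generator's residuals.
    d_loss = (bce(D(torch.zeros_like(residual)), torch.ones(64, 1))
              + bce(D(residual.detach()), torch.zeros(64, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: make its residuals indistinguishable from zeros.
    g_loss = bce(D(residual), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```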
arXiv Detail & Related papers (2022-09-15T06:39:47Z)
- Stochastic Scaling in Loss Functions for Physics-Informed Neural Networks [0.0]
Trained neural networks act as universal function approximators, able to numerically solve differential equations in a novel way.
Variations on traditional loss function and training parameters show promise in making neural network-aided solutions more efficient.
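One simple variation in this spirit (our illustration; the paper's exact scheme may differ) rescales each collocation point's squared residual by a fresh random weight at every step, so no fixed region of the domain dominates training:

```python
import torch

def stochastically_scaled_loss(residual):
    # Fresh per-point random weights each step (mean 1) rescale the
    # squared residuals; an illustrative variant, not the paper's scheme.
    w = torch.rand_like(residual) * 2.0
    return (w * residual ** 2).mean()
```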
arXiv Detail & Related papers (2022-08-07T17:12:39Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervision time point and is able to interpolate the solutions to any intermediate time.
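A minimal sketch of the hyper-network half of such a solver (sizes and names are our illustrative choices; the paper pairs this with a Fourier Neural Operator): a small network maps the query time $t$ to the weights of the network that evaluates the solution at that time.

```python
import torch
import torch.nn.functional as F

IN, HID, OUT = 1, 32, 1
N_W = IN * HID + HID + HID * OUT + OUT      # weight count of the target net

# Hyper-network: maps time t to the parameters of a small solution network.
hyper = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(),
                            torch.nn.Linear(64, N_W))

def u(x, t):
    """Evaluate the t-conditioned solution network at spatial points x."""
    w = hyper(t.view(1, 1)).squeeze(0)
    w1, b1, w2, b2 = torch.split(w, [IN * HID, HID, HID * OUT, OUT])
    h = torch.tanh(F.linear(x, w1.view(HID, IN), b1))
    return F.linear(h, w2.view(OUT, HID), b2)
```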
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs).
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
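The underlying representation is classical: a pole expansion in the Laplace domain inverts term by term to a sum of complex exponentials in time,

```latex
\[
  U(s) = \sum_{k} \frac{c_k}{s - a_k}
  \quad\Longleftrightarrow\quad
  u(t) = \sum_{k} c_k \, e^{a_k t},
\]
```

so oscillation and decay are encoded compactly in the learned complex poles $a_k$.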
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and the Navier-Stokes equation.
The method is up to three orders of magnitude faster than traditional PDE solvers.
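The core of the method is the spectral convolution layer: FFT the input, multiply the lowest Fourier modes by learned complex weights, and inverse FFT. A minimal 1-D PyTorch sketch (our reimplementation of the published idea; sizes illustrative):

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Minimal 1-D Fourier layer: learned multiplication of the lowest
    `modes` Fourier coefficients (illustrative reimplementation)."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes                  # must satisfy modes <= grid//2 + 1
        scale = 1.0 / channels
        self.weight = torch.nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                   # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)            # (batch, channels, grid//2 + 1)
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))
```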
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
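The min-max formulation is, in essence, the adversarial form of a conditional moment restriction $\mathbb{E}[(Y - g(X)) \mid Z] = 0$: the critic $f$ tries to expose violations and the estimator $g$ tries to remove them. A simplified gradient descent-ascent sketch (our illustration, not the authors' algorithm):

```python
import torch

g = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
f = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt_g = torch.optim.Adam(g.parameters(), lr=1e-3)
opt_f = torch.optim.Adam(f.parameters(), lr=1e-3)

def game_value(x, y, z):
    # E[(Y - g(X)) f(Z)] - 0.5 E[f(Z)^2]: nonzero only when the moment
    # condition E[(Y - g(X)) | Z] = 0 is violated somewhere f can detect.
    fz = f(z)
    return ((y - g(x)) * fz).mean() - 0.5 * (fz ** 2).mean()

def step(x, y, z):
    # Ascent on the critic f, descent on the estimator g.
    opt_f.zero_grad(); (-game_value(x, y, z)).backward(); opt_f.step()
    opt_g.zero_grad(); game_value(x, y, z).backward(); opt_g.step()
```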
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Learning To Solve Differential Equations Across Initial Conditions [12.66964917876272]
A number of neural network-based partial differential equation solvers have been formulated that perform on par with, and in some cases better than, classical solvers.
In this work, we pose the problem of approximating the solution of a fixed partial differential equation for arbitrary initial conditions as learning a conditional probability distribution.
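In the simplest deterministic reading of that idea (our illustration; the paper learns a conditional distribution rather than a point estimate), the solver network takes a discretized initial condition as an extra input, so one trained model covers a whole family of problems:

```python
import torch

class ConditionedSolver(torch.nn.Module):
    """u_theta(x, t, u0): a solution network conditioned on a discretized
    initial condition u0 (names and sizes are illustrative)."""
    def __init__(self, n_ic=64):
        super().__init__()
        self.ic_encoder = torch.nn.Linear(n_ic, 32)
        self.body = torch.nn.Sequential(
            torch.nn.Linear(2 + 32, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))

    def forward(self, x, t, u0):
        code = torch.tanh(self.ic_encoder(u0))            # embed the IC
        return self.body(torch.cat([x, t, code], dim=-1))
```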
arXiv Detail & Related papers (2020-03-26T21:29:22Z)
- FiniteNet: A Fully Convolutional LSTM Network Architecture for Time-Dependent Partial Differential Equations [0.0]
We use a fully convolutional LSTM network to exploit the dynamics of PDEs.
We show that our network can reduce error by a factor of 2 to 3 compared to the baseline algorithms.
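PyTorch ships no built-in convolutional LSTM, so a minimal cell in this spirit (a standard construction, not the FiniteNet authors' code) computes all four gates with one convolution over the concatenated input and hidden state:

```python
import torch

class ConvLSTMCell(torch.nn.Module):
    """Minimal 1-D convolutional LSTM cell (standard construction)."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.gates = torch.nn.Conv1d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):            # x: (batch, in_ch, grid)
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, (h, c)
```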
arXiv Detail & Related papers (2020-02-07T21:18:46Z)