Unsupervised Learning of Solutions to Differential Equations with
Generative Adversarial Networks
- URL: http://arxiv.org/abs/2007.11133v1
- Date: Tue, 21 Jul 2020 23:36:36 GMT
- Title: Unsupervised Learning of Solutions to Differential Equations with
Generative Adversarial Networks
- Authors: Dylan Randle, Pavlos Protopapas, David Sondak
- Abstract summary: We develop a novel method for solving differential equations with unsupervised neural networks.
We show that our method, which we call Differential Equation GAN (DEQGAN), can obtain multiple orders of magnitude lower mean squared errors.
- Score: 1.1470070927586016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Solutions to differential equations are of significant scientific and
engineering relevance. Recently, there has been a growing interest in solving
differential equations with neural networks. This work develops a novel method
for solving differential equations with unsupervised neural networks that
applies Generative Adversarial Networks (GANs) to \emph{learn the loss
function} for optimizing the neural network. We present empirical results
showing that our method, which we call Differential Equation GAN (DEQGAN), can
obtain multiple orders of magnitude lower mean squared errors than an
alternative unsupervised neural network method based on (squared) $L_2$, $L_1$,
and Huber loss functions. Moreover, we show that DEQGAN achieves solution
accuracy that is competitive with traditional numerical methods. Finally, we
analyze the stability of our approach and find it to be sensitive to the
selection of hyperparameters, which we provide in the appendix.
Code available at https://github.com/dylanrandle/denn. Please address any
electronic correspondence to dylanrandle@alumni.harvard.edu.
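The abstract states the key idea only at a high level: a generator network proposes a solution, the residual of the differential equation is fed to a discriminator, and the adversarial game supplies the training signal in place of a fixed (squared) $L_2$, $L_1$, or Huber penalty. Below is a minimal, hypothetical sketch of that idea for the toy problem du/dt = -u with u(0) = 1; the equation, trial-solution form, architectures, and hyperparameters are assumptions made here for illustration, and the authors' actual implementation is in the repository linked above.

```python
# Hypothetical sketch of the "learned loss" idea from the abstract, applied to
# the toy ODE du/dt = -u, u(0) = 1. The equation, architectures, and
# hyperparameters are illustrative assumptions, not the paper's configuration.
import torch
import torch.nn as nn

torch.manual_seed(0)

generator = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                          nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
discriminator = nn.Sequential(nn.Linear(1, 32), nn.LeakyReLU(0.2), nn.Linear(32, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def residual(t):
    """ODE residual of the trial solution u(t) = 1 + t * N(t), which enforces u(0) = 1."""
    t = t.requires_grad_(True)
    u = 1.0 + t * generator(t)
    du_dt = torch.autograd.grad(u, t, grad_outputs=torch.ones_like(u), create_graph=True)[0]
    return du_dt + u                          # zero for the exact solution u(t) = exp(-t)

for step in range(5000):
    t = torch.rand(128, 1) * 2.0              # sample collocation points in [0, 2]
    r = residual(t)

    # Discriminator: distinguish "real" zeros from the generator's residuals.
    d_opt.zero_grad()
    d_loss = (bce(discriminator(torch.zeros_like(r)), torch.ones_like(r))
              + bce(discriminator(r.detach()), torch.zeros_like(r)))
    d_loss.backward()
    d_opt.step()

    # Generator: push residuals toward what the discriminator accepts as "real",
    # so the adversarial game, not a fixed L2/L1/Huber penalty, defines the loss.
    g_opt.zero_grad()
    g_loss = bce(discriminator(r), torch.ones_like(r))
    g_loss.backward()
    g_opt.step()
```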
Related papers
- LinSATNet: The Positive Linear Satisfiability Neural Networks [116.65291739666303]
This paper studies how to introduce the popular positive linear satisfiability to neural networks.
We propose the first differentiable satisfiability layer based on an extension of the classic Sinkhorn algorithm for jointly encoding multiple sets of marginal distributions.
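For context, the sketch below shows only the classic Sinkhorn normalization that this entry says the proposed layer extends; the multi-marginal satisfiability extension itself is not reproduced, and the function name and parameters are illustrative.

```python
# Classic log-domain Sinkhorn normalization: alternate row and column balancing
# to project a score matrix onto an (approximately) doubly stochastic matrix.
import torch

def sinkhorn(scores: torch.Tensor, n_iters: int = 20, tau: float = 0.1) -> torch.Tensor:
    log_p = scores / tau
    for _ in range(n_iters):
        log_p = log_p - torch.logsumexp(log_p, dim=1, keepdim=True)  # normalize rows
        log_p = log_p - torch.logsumexp(log_p, dim=0, keepdim=True)  # normalize columns
    return log_p.exp()

p = sinkhorn(torch.randn(5, 5))   # rows and columns each sum to roughly 1
```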
arXiv Detail & Related papers (2024-07-18T22:05:21Z)
- Chebyshev Spectral Neural Networks for Solving Partial Differential Equations [0.0]
The study uses a feedforward neural network model and error backpropagation principles, utilizing automatic differentiation (AD) to compute the loss function.
The numerical efficiency and accuracy of the CSNN model are investigated through testing on elliptic partial differential equations, and it is compared with the well-known Physics-Informed Neural Network (PINN) method.
arXiv Detail & Related papers (2024-06-06T05:31:45Z)
- DEQGAN: Learning the Loss Function for PINNs with Generative Adversarial
Networks [1.0499611180329804]
This work presents Differential Equation GAN (DEQGAN), a novel method for solving differential equations using generative adversarial networks.
We show that DEQGAN achieves multiple orders of magnitude lower mean squared errors than PINNs.
We also show that DEQGAN achieves solution accuracies that are competitive with popular numerical methods.
arXiv Detail & Related papers (2022-09-15T06:39:47Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
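A minimal, hypothetical sketch of one learned message-passing update in the spirit of the solver described above; the feature sizes, MLPs, and residual update rule are assumptions for illustration, not the paper's design.

```python
# One learned message-passing step over a graph of grid cells (illustrative only).
import torch
import torch.nn as nn

hidden = 64
edge_mlp = nn.Sequential(nn.Linear(2 * hidden + 1, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
node_mlp = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, hidden))

def message_passing_step(h, edge_index, rel_pos):
    """h: (num_nodes, hidden) node states; edge_index: (2, num_edges); rel_pos: (num_edges, 1)."""
    src, dst = edge_index
    messages = edge_mlp(torch.cat([h[src], h[dst], rel_pos], dim=-1))   # per-edge messages
    agg = torch.zeros_like(h).index_add_(0, dst, messages)              # sum messages at receivers
    return h + node_mlp(torch.cat([h, agg], dim=-1))                    # residual node update

# Example: a 1D chain of 10 cells with spacing 0.1 and edges in both directions.
n = 10
left = torch.arange(n - 1)
edge_index = torch.stack([torch.cat([left, left + 1]), torch.cat([left + 1, left])])
rel_pos = torch.cat([torch.full((n - 1, 1), 0.1), torch.full((n - 1, 1), -0.1)])
h = torch.randn(n, hidden)
h = message_passing_step(h, edge_index, rel_pos)
```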
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how variability in the space of solvers can improve the performance of neural ODEs.
We show that the right choice of solver parameterization can significantly affect the robustness of neural ODE models to adversarial attacks.
arXiv Detail & Related papers (2021-03-15T17:26:34Z)
- Computational characteristics of feedforward neural networks for solving
a stiff differential equation [0.0]
We study the solution of a simple but fundamental stiff ordinary differential equation modelling a damped system.
We show that it is possible to identify preferable choices to be made for parameters and methods.
Overall, we extend the current literature in the field by showing what can be done to obtain reliable and accurate results with the neural network approach.
arXiv Detail & Related papers (2020-12-03T12:22:24Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
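A rough sketch of the Fourier-space kernel parameterization described above, in a 1D setting; the channel and mode counts are illustrative assumptions, not the authors' configuration.

```python
# Spectral convolution sketch: FFT, multiply a small set of low-frequency modes
# by learned complex weights, inverse FFT back to the grid (illustrative only).
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels: int = 16, modes: int = 12):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                       # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x, dim=-1)        # to Fourier space
        out_ft = torch.zeros_like(x_ft)
        out_ft[..., :self.modes] = torch.einsum("bim,iom->bom", x_ft[..., :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1), dim=-1)  # back to physical space

y = SpectralConv1d()(torch.randn(4, 16, 128))   # (batch, channels, grid) in and out
```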
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- ODEN: A Framework to Solve Ordinary Differential Equations using
Artificial Neural Networks [0.0]
We propose a specific loss function, which does not require knowledge of the exact solution, to evaluate neural networks' performance.
Neural networks are shown to be proficient at approximating continuous solutions within their training domains.
A user-friendly and adaptable open-source code (ODE$\mathcal{N}$) is provided on GitHub.
arXiv Detail & Related papers (2020-05-28T15:34:10Z)
- A Derivative-Free Method for Solving Elliptic Partial Differential
Equations with Deep Neural Networks [2.578242050187029]
We introduce a deep neural network based method for solving a class of elliptic partial differential equations.
We approximate the solution of the PDE with a deep neural network which is trained under the guidance of a probabilistic representation of the PDE.
As Brownian walkers explore the domain, the deep neural network is iteratively trained using a form of reinforcement learning.
arXiv Detail & Related papers (2020-01-17T03:29:24Z)
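A simplified, hypothetical illustration of the probabilistic representation behind the last entry: for the Laplace equation on the unit disk with boundary data g, Kakutani's formula gives u(x) = E[g(B_tau)] for Brownian motion B started at x and stopped on exiting the domain. In the sketch below, walkers supply Monte-Carlo regression targets for a network; the paper's actual reinforcement-learning-style training scheme is more involved, and all names and parameters here are illustrative.

```python
# Brownian walkers provide unbiased targets u(x0) = E[g(B_tau)] for the harmonic
# extension of the boundary data g on the unit disk (simplified sketch).
import torch
import torch.nn as nn

torch.manual_seed(0)

g = lambda xy: xy[:, 0] * xy[:, 1]            # harmonic boundary data, so u(x, y) = x*y inside
net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def exit_values(x0, dt=1e-2, max_steps=5000):
    """Run Brownian walkers from x0 until they leave the unit disk; return g at the exit points."""
    x = x0.clone()
    done = torch.zeros(x.size(0), dtype=torch.bool)
    for _ in range(max_steps):
        n_active = int((~done).sum())
        if n_active == 0:
            break
        x[~done] = x[~done] + (dt ** 0.5) * torch.randn(n_active, 2)
        done = done | (x.norm(dim=1) >= 1.0)
    x = x / x.norm(dim=1, keepdim=True).clamp(min=1.0)   # snap exited walkers back onto the circle
    return g(x)

for step in range(200):
    x0 = torch.randn(256, 2) * 0.3
    x0 = x0[x0.norm(dim=1) < 0.9]                        # keep starting points inside the domain
    target = exit_values(x0)
    opt.zero_grad()
    loss = ((net(x0).squeeze(-1) - target) ** 2).mean()  # regress onto the Monte-Carlo targets
    loss.backward()
    opt.step()
```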