Can Transfer Neuroevolution Tractably Solve Your Differential Equations?
- URL: http://arxiv.org/abs/2101.01998v2
- Date: Wed, 5 May 2021 02:37:12 GMT
- Title: Can Transfer Neuroevolution Tractably Solve Your Differential Equations?
- Authors: Jian Cheng Wong, Abhishek Gupta, Yew-Soon Ong
- Abstract summary: This paper introduces neuroevolution for solving differential equations.
Neuroevolution carries out a parallel exploration of diverse solutions with the goal of circumventing local optima.
A novel and computationally efficient transfer neuroevolution algorithm is proposed in this paper.
- Score: 22.714772862513826
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces neuroevolution for solving differential equations. The
solution is obtained through optimizing a deep neural network whose loss
function is defined by the residual terms from the differential equations.
Recent studies have focused on learning such physics-informed neural networks
through stochastic gradient descent (SGD) variants, yet they face the
difficulty of obtaining an accurate solution due to optimization challenges. In
the context of solving differential equations, we are faced with the problem of
finding globally optimum parameters of the network, instead of being concerned
with out-of-sample generalization. SGD, which searches along a single gradient
direction, is prone to become trapped in local optima, so it may not be the
best approach here. In contrast, neuroevolution carries out a parallel
exploration of diverse solutions with the goal of circumventing local optima.
It could potentially find more accurate solutions with better optimized neural
networks. However, neuroevolution can be slow, raising tractability issues in
practice. With that in mind, a novel and computationally efficient transfer
neuroevolution algorithm is proposed in this paper. Our method is capable of
exploiting relevant experiential priors when solving a new problem, with
adaptation to protect against the risk of negative transfer. The algorithm is
applied on a variety of differential equations to empirically demonstrate that
transfer neuroevolution can indeed achieve better accuracy and faster
convergence than SGD. The experimental outcomes thus establish transfer
neuroevolution as a noteworthy approach for solving differential equations, one
that has not been studied before. Our work expands the set of available
algorithms for optimizing physics-informed neural networks.
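To make the loss construction concrete, here is a minimal sketch, assuming PyTorch, a toy ODE du/dt + u = 0 with u(0) = 1, and an arbitrary small network; it illustrates the kind of physics-informed loss the abstract describes and is not the authors' code:

```python
# Minimal physics-informed loss sketch (illustrative assumptions, not the paper's code).
# The network u(t) is trained so the residual of du/dt + u = 0 vanishes at
# collocation points and the initial condition u(0) = 1 is satisfied.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def pinn_loss(net, t):
    t = t.requires_grad_(True)
    u = net(t)
    du_dt = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    residual = du_dt + u                      # residual of du/dt + u = 0
    ic = net(torch.zeros(1, 1)) - 1.0         # initial condition u(0) = 1
    return (residual ** 2).mean() + (ic ** 2).mean()

t = torch.rand(64, 1)                         # collocation points in [0, 1]
loss = pinn_loss(net, t)                      # scalar an optimizer minimizes
```

Any optimizer that can score a parameter vector against this loss, gradient-based or not, can be plugged in, which is what makes a population-based search applicable.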
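The transfer-neuroevolution idea itself can be caricatured as follows. This is a heavily simplified sketch of the general principle (elitist evolution with a prior-seeded population), not the authors' algorithm; `loss`, `prior`, and all hyperparameters are hypothetical:

```python
# Simplified neuroevolution sketch: evolve a population of flattened network
# parameter vectors to minimize a black-box loss (e.g., the PINN loss above).
# Transfer: part of the population is seeded near parameters found on a
# related, previously solved equation; the random remainder hedges against
# negative transfer when the prior is unhelpful.
import numpy as np

def evolve(loss, dim, prior=None, pop_size=32, gens=200, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.normal(0.0, 1.0, size=(pop_size, dim))
    if prior is not None:                     # transfer: seed half the population
        pop[: pop_size // 2] = prior + sigma * rng.normal(size=(pop_size // 2, dim))
    for _ in range(gens):
        fitness = np.array([loss(p) for p in pop])
        elite = pop[np.argsort(fitness)[: pop_size // 4]]   # keep the best 25%
        parents = elite[rng.integers(0, len(elite), size=pop_size - len(elite))]
        pop = np.vstack([elite, parents + sigma * rng.normal(size=parents.shape)])
    return min(pop, key=loss)
```

Per the abstract, the paper's actual method adapts the transfer to protect against negative transfer; the fixed half-and-half split above is only a stand-in for that mechanism.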
Related papers
- The Unreasonable Effectiveness of Solving Inverse Problems with Neural Networks [24.766470360665647]
We show that neural networks trained to learn solutions to inverse problems can find better solutions than classical methods, even on their training set.
Our findings suggest an alternative use for neural networks: rather than generalizing to new data for fast inference, they can also be used to find better solutions on known data.
arXiv Detail & Related papers (2024-08-15T12:38:10Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective at solving forward and inverse differential equation problems.
However, PINNs can fail to train when the target functions to be approximated exhibit high-frequency or multi-scale features.
This paper proposes employing the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process (a toy sketch of the implicit update appears after this list).
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Neuroevolution of Physics-Informed Neural Nets: Benchmark Problems and Comparative Results [25.12291688711645]
Physics-informed neural networks (PINNs) are one of the key techniques at the forefront of recent advances.
PINNs' unique loss formulations lead to a high degree of complexity and ruggedness that may not be conducive to gradient descent.
Neuroevolution algorithms, with their superior global search capacity, may be a better choice for PINNs.
arXiv Detail & Related papers (2022-12-15T05:54:16Z)
- Adaptive neural domain refinement for solving time-dependent differential equations [0.0]
A classic approach for solving differential equations with neural networks builds upon neural forms, which employ the differential equation with a discretisation of the solution domain.
It would be desirable to transfer successful strategies from classical numerical methods, such as adaptive refinement of the solution domain, to neural-network-based solvers.
We propose a novel adaptive neural approach along these lines for solving time-dependent problems.
arXiv Detail & Related papers (2021-12-23T13:19:07Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Non-Gradient Manifold Neural Network [79.44066256794187]
Deep neural networks (DNNs) generally take thousands of iterations to optimize via gradient descent.
We propose a novel manifold neural network based on non-gradient optimization.
arXiv Detail & Related papers (2021-06-15T06:39:13Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems (a minimal sketch of this conditioning idea appears after this list).
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how variability in the space of solvers can improve the performance of neural ODEs.
We show that the right choice of solver parameterization can significantly affect the robustness of neural ODE models to adversarial attacks.
arXiv Detail & Related papers (2021-03-15T17:26:34Z)
- Unsupervised Learning of Solutions to Differential Equations with Generative Adversarial Networks [1.1470070927586016]
We develop a novel method for solving differential equations with unsupervised neural networks.
We show that our method, which we call Differential Equation GAN (DEQGAN), can obtain multiple orders of magnitude lower mean squared errors.
arXiv Detail & Related papers (2020-07-21T23:36:36Z)
- ODEN: A Framework to Solve Ordinary Differential Equations using Artificial Neural Networks [0.0]
We prove that a specific loss function, which does not require knowledge of the exact solution, is a suitable metric for evaluating neural networks' performance.
Neural networks are shown to be proficient at approximating continuous solutions within their training domains.
A user-friendly and adaptable open-source code (ODE$\mathcal{N}$) is provided on GitHub.
arXiv Detail & Related papers (2020-05-28T15:34:10Z)
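As a toy illustration of the implicit update behind the ISGD entry above (my own sketch, not the paper's method): the implicit step solves theta_new = theta - lr * grad(theta_new), which is the optimality condition of a proximal subproblem, and it remains stable at learning rates where the explicit step diverges:

```python
# Implicit gradient step via its proximal form:
#   theta_new = argmin_x f(x) + ||x - theta||^2 / (2 * lr),
# whose optimality condition is theta_new = theta - lr * grad(theta_new).
import numpy as np

def implicit_sgd_step(grad, theta, lr, inner_lr=0.2, inner_iters=100):
    x = theta.copy()
    for _ in range(inner_iters):              # inner gradient descent on the prox
        x = x - inner_lr * (grad(x) + (x - theta) / lr)
    return x

grad = lambda theta: theta                    # gradient of f(theta) = 0.5*||theta||^2
theta = np.array([4.0, -2.0])
for _ in range(10):
    theta = implicit_sgd_step(grad, theta, lr=3.0)  # explicit SGD diverges at lr=3
print(theta)                                  # close to the minimizer at the origin
```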
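And a minimal sketch of the conditioning idea in the conditional-PINN entry above (hypothetical names and sizes, not the paper's architecture): the equation parameter is appended to the network input, so one model represents a whole family of solutions u(x; lambda):

```python
import torch

# One network for a family of problems: it maps (x, lam) -> u(x; lam),
# where lam is the equation/eigenvalue parameter being conditioned on.
cond_net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
x = torch.rand(128, 1)        # spatial collocation points
lam = torch.rand(128, 1)      # one sampled equation parameter per point
u = cond_net(torch.cat([x, lam], dim=1))
```

The physics-informed residual loss would then be evaluated per (x, lambda) pair, exactly as in the first sketch above.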
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.