Unsupervised Legendre-Galerkin Neural Network for Stiff Partial
Differential Equations
- URL: http://arxiv.org/abs/2207.10241v2
- Date: Fri, 22 Jul 2022 01:55:35 GMT
- Title: Unsupervised Legendre-Galerkin Neural Network for Stiff Partial
Differential Equations
- Authors: Junho Choi, Namjung Kim and Youngjoon Hong
- Abstract summary: We propose an unsupervised machine learning algorithm based on the Legendre-Galerkin neural network to find an accurate approximation to the solution of different types of PDEs.
The proposed neural network is applied to the general 1D and 2D PDEs as well as singularly perturbed PDEs that possess boundary layer behavior.
- Score: 9.659504024299896
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Machine learning methods have been lately used to solve differential
equations and dynamical systems. These approaches have been developed into a
novel research field known as scientific machine learning in which techniques
such as deep neural networks and statistical learning are applied to classical
problems of applied mathematics. Because neural networks provide an
approximation capability, computational parameterization through machine
learning and optimization methods achieve noticeable performance when solving
various partial differential equations (PDEs). In this paper, we develop a
novel numerical algorithm that incorporates machine learning and artificial
intelligence to solve PDEs. In particular, we propose an unsupervised machine
learning algorithm based on the Legendre-Galerkin neural network to find an
accurate approximation to the solution of different types of PDEs. The proposed
neural network is applied to the general 1D and 2D PDEs as well as singularly
perturbed PDEs that possess boundary layer behavior.
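The abstract does not spell out the construction, so the following is a minimal sketch only, assuming a network that maps sampled forcing data f(x_j) to coefficients of the compact Legendre-Galerkin basis phi_k = L_k - L_{k+2} (which satisfies homogeneous Dirichlet conditions by construction) and is trained purely on the PDE residual, shown for the model problem -u'' + u = f on (-1, 1). The layer widths, collocation points, and forcing family are illustrative choices, not the authors' settings.

```python
# Minimal sketch (not the authors' code) of an unsupervised Legendre-Galerkin
# neural network for -u'' + u = f on (-1, 1) with u(-1) = u(1) = 0.
# Assumption: the network maps sampled forcing values to spectral coefficients
# of the basis phi_k = P_k - P_{k+2} and sees only the PDE residual, no solution data.
import numpy as np
import torch
import torch.nn as nn

N_MODES, N_COLLOC = 16, 64
x = np.cos(np.pi * np.arange(N_COLLOC) / (N_COLLOC - 1))   # collocation points in [-1, 1]

# Basis phi_k(x) = P_k(x) - P_{k+2}(x) vanishes at x = -1 and x = 1.
Phi = np.zeros((N_COLLOC, N_MODES))
Phi_xx = np.zeros((N_COLLOC, N_MODES))
for k in range(N_MODES):
    pk = np.polynomial.legendre.Legendre.basis(k) - np.polynomial.legendre.Legendre.basis(k + 2)
    Phi[:, k] = pk(x)
    Phi_xx[:, k] = pk.deriv(2)(x)

Phi_t = torch.tensor(Phi, dtype=torch.float32)
Phi_xx_t = torch.tensor(Phi_xx, dtype=torch.float32)
x_t = torch.tensor(x, dtype=torch.float32)

# Network: sampled forcing f(x_j) -> Legendre-Galerkin coefficients c_k.
net = nn.Sequential(nn.Linear(N_COLLOC, 128), nn.Tanh(),
                    nn.Linear(128, 128), nn.Tanh(),
                    nn.Linear(128, N_MODES))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    # Random family of smooth forcings; the loss never sees the exact solution.
    a = torch.randn(32, 3)
    f = sum(a[:, j:j + 1] * torch.sin((j + 1) * np.pi * x_t) for j in range(3))
    c = net(f)                                   # (batch, N_MODES)
    u = c @ Phi_t.T                              # candidate solution at collocation points
    u_xx = c @ Phi_xx_t.T                        # its second derivative
    residual = -u_xx + u - f                     # strong-form residual of -u'' + u = f
    loss = residual.pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```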
Related papers
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
arXiv Detail & Related papers (2024-06-05T17:59:22Z)
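For orientation, here is a minimal sketch of the classical walk-on-spheres estimator that grid-free solvers such as NWoS build on; it is not the NWoS network itself, and the unit-disk domain and boundary data g are illustrative assumptions.

```python
# Classical walk-on-spheres Monte Carlo estimator for Laplace's equation on the
# unit disk with Dirichlet data g (background for NWoS, not the neural solver).
import numpy as np

rng = np.random.default_rng(0)

def g(p):
    """Hypothetical boundary data u = g on the unit circle."""
    return p[0] ** 2 - p[1] ** 2          # harmonic, so u(x, y) = x^2 - y^2 inside

def dist_to_boundary(p):
    return 1.0 - np.linalg.norm(p)        # distance from p to the unit circle

def walk_on_spheres(x, eps=1e-3, max_steps=1000):
    """One Monte Carlo sample of u(x)."""
    p = np.array(x, dtype=float)
    for _ in range(max_steps):
        r = dist_to_boundary(p)
        if r < eps:                       # close enough: read off the boundary value
            return g(p / np.linalg.norm(p))
        theta = rng.uniform(0.0, 2.0 * np.pi)
        p = p + r * np.array([np.cos(theta), np.sin(theta)])  # jump to the sphere surface
    return g(p / np.linalg.norm(p))

x0 = (0.3, 0.2)
estimate = np.mean([walk_on_spheres(x0) for _ in range(5000)])
print(estimate, g(np.array(x0)))          # the two values should be close
```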
- Newton Informed Neural Operator for Computing Multiple Solutions of Nonlinear Partials Differential Equations [3.8916312075738273]
We propose a novel approach called the Newton Informed Neural Operator to tackle nonlinearities.
Our method builds on classical Newton methods for well-posed problems and efficiently learns multiple solutions in a single learning process.
arXiv Detail & Related papers (2024-05-23T01:52:54Z)
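As background, this is a sketch of the classical Newton iteration that such an operator is paired with, applied to a finite-difference discretization of a model nonlinear boundary-value problem. The grid, forcing, and initial guess are illustrative, and the learned-operator component is not shown.

```python
# Classical Newton iteration for a discretized nonlinear BVP,
# -u'' + u^3 = f on (0, 1) with u(0) = u(1) = 0 (background only).
import numpy as np

n = 99                                   # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = np.sin(np.pi * x)                    # hypothetical forcing

# Second-difference matrix for -u'' with homogeneous Dirichlet conditions.
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h ** 2

u = np.zeros(n)                          # initial guess; different guesses can
for it in range(20):                     # converge to different solutions
    F = A @ u + u ** 3 - f               # nonlinear residual
    J = A + np.diag(3.0 * u ** 2)        # Jacobian of the residual
    du = np.linalg.solve(J, -F)          # Newton update
    u += du
    if np.linalg.norm(du) < 1e-10:
        break
```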
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
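To make the decomposition concrete, here is an illustrative sketch, not the paper's exact scheme, of splitting a fine 2D field into interleaved coarse sub-fields that can each be handled by a cheaper lower-resolution solver or network, and then re-assembling the full-resolution field.

```python
# Illustrative spatial staggering: a fine (H, W) field becomes s*s interleaved
# coarse fields, and the original field is recovered by re-interleaving.
import numpy as np

def stagger_split(u, s=2):
    """Split an (H, W) field into s*s coarse fields of shape (H//s, W//s)."""
    return [u[i::s, j::s] for i in range(s) for j in range(s)]

def stagger_merge(subs, s=2):
    """Inverse of stagger_split."""
    H, W = subs[0].shape[0] * s, subs[0].shape[1] * s
    u = np.empty((H, W), dtype=subs[0].dtype)
    for k, (i, j) in enumerate((i, j) for i in range(s) for j in range(s)):
        u[i::s, j::s] = subs[k]
    return u

u = np.random.rand(64, 64)                # a fine-resolution state
coarse_fields = stagger_split(u)          # four 32x32 subtasks
assert np.allclose(stagger_merge(coarse_fields), u)
```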
- Deep Learning Methods for Partial Differential Equations and Related Parameter Identification Problems [1.7150329136228712]
More and more neural network architectures have been developed to solve specific classes of partial differential equations (PDEs).
Such methods exploit properties that are inherent to PDEs and thus solve the PDEs better than standard feed-forward neural networks, recurrent neural networks, or convolutional neural networks.
This has had a great impact in the area of mathematical modeling where parametric PDEs are widely used to model most natural and physical processes arising in science and engineering.
arXiv Detail & Related papers (2022-12-06T16:53:34Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
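A small sketch of the representation this relies on, not the Neural Laplace architecture itself: simple poles and residues in the Laplace domain correspond to a sum of complex exponentials in time, which compactly captures decay and oscillation. The poles and residues below are arbitrary example values.

```python
# Laplace-domain poles p_k with residues c_k correspond to the time-domain
# signal x(t) = Re( sum_k c_k * exp(p_k * t) ), a summation of complex exponentials.
import numpy as np

t = np.linspace(0.0, 10.0, 500)
poles = np.array([-0.3 + 2.0j, -0.3 - 2.0j])     # complex-conjugate pole pair
residues = np.array([0.5 - 0.1j, 0.5 + 0.1j])

x = np.real(np.sum(residues[:, None] * np.exp(poles[:, None] * t[None, :]), axis=0))
# x is a decaying oscillation; a model predicting (poles, residues) in the
# Laplace domain can represent such trajectories without stepping through time.
```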
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
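For reference, this is a minimal sketch of the standard physics-informed loss whose optimization behavior the paper studies, on a 1D Poisson model problem. The network size, collocation sampling, and boundary weighting are illustrative choices.

```python
# Standard PINN loss: penalize the PDE residual and the boundary conditions,
# here for -u''(x) = pi^2 * sin(pi x) on (0, 1) with u(0) = u(1) = 0.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(256, 1, requires_grad=True)             # interior collocation points
    u = net(x)
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    f = (torch.pi ** 2) * torch.sin(torch.pi * x)
    pde_loss = ((-u_xx - f) ** 2).mean()                    # PDE residual term
    xb = torch.tensor([[0.0], [1.0]])
    bc_loss = (net(xb) ** 2).mean()                         # boundary-condition term
    loss = pde_loss + bc_loss                               # the loss landscape analyzed in the paper
    opt.zero_grad(); loss.backward(); opt.step()
```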
- A deep learning theory for neural networks grounded in physics [2.132096006921048]
We argue that building large, fast and efficient neural networks on neuromorphic architectures requires rethinking the algorithms to implement and train them.
Our framework applies to a very broad class of models, namely systems whose state or dynamics are described by variational equations.
arXiv Detail & Related papers (2021-03-18T02:12:48Z)
- Partial Differential Equations is All You Need for Generating Neural Architectures -- A Theory for Physical Artificial Intelligence Systems [40.20472268839781]
We generalize the reaction-diffusion equation in statistical physics, the Schrödinger equation in quantum mechanics, and the Helmholtz equation in paraxial optics.
We use the finite difference method to discretize the NPDE and find a numerical solution.
Basic building blocks of deep neural network architectures, including multi-layer perceptrons, convolutional neural networks, and recurrent neural networks, are generated.
arXiv Detail & Related papers (2021-03-10T00:05:46Z)
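One way to see the PDE-to-architecture connection: a single explicit finite-difference step of the 1D heat equation is a convolution with a fixed 3-tap kernel, i.e. a convolutional layer with hand-set weights. The sketch below is illustrative and is not the paper's NPDE construction.

```python
# One explicit Euler step of u_t = u_xx via the [1, -2, 1] stencil,
# written as a convolution (a CNN layer with fixed weights).
import numpy as np

def heat_step(u, dt, dx):
    """Advance u_t = u_xx by one explicit step with Dirichlet endpoints."""
    kernel = np.array([1.0, -2.0, 1.0]) * dt / dx ** 2
    lap = np.convolve(u, kernel, mode="same")      # discrete Laplacian as a convolution
    lap[0] = lap[-1] = 0.0                         # keep the boundary values fixed
    return u + lap

x = np.linspace(0.0, 1.0, 101)
u = np.sin(np.pi * x)                              # initial condition with u = 0 at both ends
for _ in range(100):
    u = heat_step(u, dt=4e-5, dx=x[1] - x[0])      # dt chosen below the stability limit dx^2 / 2
```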
- Neural-PDE: A RNN based neural network for solving time dependent PDEs [6.560798708375526]
Partial differential equations (PDEs) play a crucial role in studying a vast number of problems in science and engineering.
We propose a sequence deep learning framework called Neural-PDE, which automatically learns the governing rules of any time-dependent PDE system.
In our experiments, Neural-PDE can efficiently extract the dynamics within 20 epochs of training and produces accurate predictions.
arXiv Detail & Related papers (2020-09-08T15:46:00Z)
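A minimal sketch of the sequence-learning setup described here, with an LSTM standing in for the recurrent core; the grid size, hidden width, and placeholder training data are assumptions rather than the paper's configuration.

```python
# A recurrent network reads discretized PDE states u(t_1), ..., u(t_k) on a grid
# and predicts the next state u(t_{k+1}); trained on solver data, then rolled out.
import torch
import torch.nn as nn

class SequencePDE(nn.Module):
    def __init__(self, n_grid=64, hidden=128):
        super().__init__()
        self.rnn = nn.LSTM(input_size=n_grid, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_grid)

    def forward(self, u_seq):               # u_seq: (batch, time, n_grid)
        h, _ = self.rnn(u_seq)
        return self.head(h[:, -1])          # predicted state at the next time level

model = SequencePDE()
u_seq = torch.randn(8, 10, 64)              # placeholder trajectories from any solver
target = torch.randn(8, 64)                 # placeholder next states
loss = nn.functional.mse_loss(model(u_seq), target)
loss.backward()
```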
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)