On NeuroSymbolic Solutions for PDEs
- URL: http://arxiv.org/abs/2207.06240v1
- Date: Mon, 11 Jul 2022 16:04:20 GMT
- Title: On NeuroSymbolic Solutions for PDEs
- Authors: Ritam Majumdar, Vishal Jadhav, Anirudh Deodhar, Shirish Karande,
Lovekesh Vig
- Abstract summary: Physics Informed Neural Networks (PINNs) have gained immense popularity as an alternate method for numerically solving PDEs.
In this work we explore a NeuroSymbolic approach to approximate the solution for PDEs.
We show that the NeuroSymbolic approximations are consistently 1-2 orders of magnitude better than purely neural or symbolic approximations.
- Score: 12.968529838140036
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Physics Informed Neural Networks (PINNs) have gained immense popularity as an
alternate method for numerically solving PDEs. Despite their empirical success, we are still building an understanding of the convergence properties of
training on such constraints with gradient descent. It is known that, in the
absence of an explicit inductive bias, Neural Networks can struggle to learn or
approximate even simple and well known functions in a sample efficient manner.
Thus, the numerical approximation induced from a few collocation points may not
generalize over the entire domain. Meanwhile, a symbolic form can exhibit good
generalization, with interpretability as a useful byproduct. However, symbolic
approximations can struggle to simultaneously be concise and accurate.
Therefore, in this work we explore a NeuroSymbolic approach to approximate the
solution of PDEs. We observe that our approach works in several simple cases.
We illustrate the efficacy of our approach on the Navier-Stokes Kovasznay flow,
where multiple physical quantities of interest are governed by a coupled
non-linear PDE system. Domain splitting is becoming a popular trick for helping
PINNs approximate complex functions, and we observe that a NeuroSymbolic
approach can help with such complex functions as well. We demonstrate a
domain-splitting-assisted NeuroSymbolic approach on a temporally varying
two-dimensional Burgers' equation. Finally, we consider the scenario where
PINNs must be solved for parameterized PDEs, with changing initial-boundary
conditions and changing PDE coefficients. Hypernetworks have shown promise in
overcoming these challenges. We show that one can design Hyper-NeuroSymbolic
Networks that combine the benefits of speed and increased accuracy. We observe
that the NeuroSymbolic approximations are consistently 1-2 orders of magnitude
better than purely neural or symbolic approximations.
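The core idea in the abstract is a hybrid ansatz: a concise symbolic expression captures the dominant structure of the solution, and a learned correction absorbs the remaining error. A minimal numerical sketch of that decomposition follows; the target function, the symbolic guess, and the use of a low-degree polynomial least-squares fit as a stand-in for the neural correction are all illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def u(x):
    # Stand-in "exact PDE solution" on [0, 1] (illustrative assumption).
    return np.sin(np.pi * x) * np.exp(-x)

def symbolic(x):
    # Concise symbolic approximation: captures the dominant oscillation
    # but misses the exponential decay.
    return np.sin(np.pi * x)

x = np.linspace(0.0, 1.0, 200)

# Fit the residual u - symbolic with a low-degree polynomial, playing
# the role of the learned neural correction term.
coeffs = np.polyfit(x, u(x) - symbolic(x), deg=4)
hybrid = symbolic(x) + np.polyval(coeffs, x)

err_symbolic = np.max(np.abs(u(x) - symbolic(x)))
err_hybrid = np.max(np.abs(u(x) - hybrid))
print(f"symbolic-only max error: {err_symbolic:.3e}")
print(f"hybrid max error:        {err_hybrid:.3e}")
```

Because the symbolic part already removes the dominant structure, the residual is smooth and small, so even a crude correction model drives the combined error far below the symbolic-only error, which is the qualitative effect the paper reports.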
Related papers
- Learning Traveling Solitary Waves Using Separable Gaussian Neural
Networks [0.9065034043031668]
We apply a machine-learning approach to learn traveling solitary waves across various families of partial differential equations (PDEs)
Our approach integrates a novel interpretable neural network (NN) architecture into the framework of Physics-Informed Neural Networks (PINNs)
arXiv Detail & Related papers (2024-03-07T20:16:18Z) - Correctness Verification of Neural Networks Approximating Differential
Equations [0.0]
Neural Networks (NNs) approximate the solution of Partial Differential Equations (PDEs)
NNs can become integral parts of simulation software tools which can accelerate the simulation of complex dynamic systems more than 100 times.
This work addresses the verification of these functions by defining the NN derivative as a finite difference approximation.
For the first time, we tackle the problem of bounding an NN function without a priori knowledge of the output domain.
arXiv Detail & Related papers (2024-02-12T12:55:35Z) - Global Convergence of Deep Galerkin and PINNs Methods for Solving
Partial Differential Equations [0.0]
A variety of deep learning methods have been developed to try and solve high-dimensional PDEs by approximating the solution using a neural network.
We prove global convergence for one of the commonly-used deep learning algorithms for solving PDEs, the Deep Galerkin Method (DGM).
arXiv Detail & Related papers (2023-05-10T09:20:11Z) - iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs)
We propose incremental PINNs that can learn multiple tasks sequentially without additional parameters for new tasks and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one by creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
arXiv Detail & Related papers (2023-04-10T20:19:20Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs can be trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ an implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method, which can partially alleviate this problem, by improving neural PDE solver sample complexity.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Spline-PINN: Approaching PDEs without Data using Fast, Physics-Informed
Hermite-Spline CNNs [4.560331122656578]
Partial Differential Equations (PDEs) are notoriously difficult to solve.
In this paper, we propose to approach the solution of PDEs based on a novel technique that combines the advantages of two recently emerging machine learning based approaches.
arXiv Detail & Related papers (2021-09-15T08:10:23Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z) - On the convergence of physics informed neural networks for linear
second-order elliptic and parabolic type PDEs [0.0]
Physics informed neural networks (PINNs) are deep learning based techniques for solving partial differential equations (PDEs)
We show that the sequence of minimizers strongly converges to the PDE solution in $C^0$.
To the best of our knowledge, this is the first theoretical work that shows the consistency of PINNs.
arXiv Detail & Related papers (2020-04-03T22:59:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.