Integral Transforms in a Physics-Informed (Quantum) Neural Network setting: Applications & Use-Cases
- URL: http://arxiv.org/abs/2206.14184v1
- Date: Tue, 28 Jun 2022 17:51:32 GMT
- Title: Integral Transforms in a Physics-Informed (Quantum) Neural Network setting: Applications & Use-Cases
- Authors: Niraj Kumar, Evan Philip, Vincent E. Elfving
- Abstract summary: In many computational problems in engineering and science, differentiation of a function or model is essential, but integration is needed as well.
In this work, we propose to augment the paradigm of Physics-Informed Neural Networks with automatic integration.
- Score: 1.7403133838762446
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many computational problems in engineering and science,
differentiation of a function or model is essential, but integration is needed
as well. An important class of computational problems, so-called
integro-differential equations, involves both integrals and derivatives of a
function. In
another example, stochastic differential equations can be written in terms of a
partial differential equation of a probability density function of the
stochastic variable. To learn characteristics of the stochastic variable based
on the density function, specific integral transforms, namely moments, of the
density function need to be calculated. Recently, the machine learning paradigm
of Physics-Informed Neural Networks emerged with increasing popularity as a
method to solve differential equations by leveraging automatic differentiation.
In this work, we propose to augment the paradigm of Physics-Informed Neural
Networks with automatic integration in order to compute complex integral
transforms on trained solutions, and to solve integro-differential equations
where integrals are computed on-the-fly during training. Furthermore, we
showcase the techniques in various application settings, numerically simulating
quantum computer-based neural networks as well as classical neural networks.
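The automatic-integration idea described in the abstract can be sketched with a simple stand-in for the neural network: a differentiable polynomial ansatz whose derivative is fitted to the integrand, so that a definite integral falls out of evaluating the antiderivative at the endpoints. This is only an illustrative analogue of the technique, not the authors' implementation; the polynomial plays the role of the trainable model.

```python
import numpy as np

def integrate_via_antiderivative(f, a, b, degree=10, n_pts=200):
    """Fit a differentiable ansatz F(x) = sum_k c_k x^k so that
    F'(x) ~ f(x) on [a, b], then return F(b) - F(a).
    A polynomial stands in here for the trainable network."""
    x = np.linspace(a, b, n_pts)
    # Design matrix for the *derivative* of the ansatz: d/dx x^k = k x^(k-1)
    D = np.stack([k * x ** (k - 1) for k in range(1, degree + 1)], axis=1)
    c, *_ = np.linalg.lstsq(D, f(x), rcond=None)
    # Antiderivative built from the fitted coefficients (constant term cancels)
    F = lambda t: sum(ck * t ** k for k, ck in enumerate(c, start=1))
    return F(b) - F(a)

# Example: integral of cos over [0, pi/2] should equal sin(pi/2) - sin(0) = 1
approx = integrate_via_antiderivative(np.cos, 0.0, np.pi / 2)
print(approx)
```

The same mechanism would yield integral transforms of a trained solution, e.g. a moment of a learned density by taking f to be x^n times that density.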
Related papers
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variates functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
- Self-Adaptive Physics-Informed Quantum Machine Learning for Solving Differential Equations [0.0]
Chebyshev polynomials have shown significant promise as an efficient tool for both classical and quantum neural networks to solve differential equations.
We adapt and generalize this framework in a quantum machine learning setting for a variety of problems.
Results indicate a promising approach to the near-term evaluation of differential equations on quantum devices.
arXiv Detail & Related papers (2023-12-14T18:46:35Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Stochastic Scaling in Loss Functions for Physics-Informed Neural Networks [0.0]
Trained neural networks act as universal function approximators, able to numerically solve differential equations in a novel way.
Variations on traditional loss function and training parameters show promise in making neural network-aided solutions more efficient.
arXiv Detail & Related papers (2022-08-07T17:12:39Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs).
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- AutoIP: A United Framework to Integrate Physics into Gaussian Processes [15.108333340471034]
We propose a framework that can integrate all kinds of differential equations into Gaussian processes.
Our method shows improvement upon vanilla GPs in both simulation and several real-world applications.
arXiv Detail & Related papers (2022-02-24T19:02:14Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Neural network representation of the probability density function of diffusion processes [0.0]
Physics-informed neural networks are developed to characterize the state of dynamical systems in a random environment.
We examine analytically and numerically the advantages and disadvantages of solving each type of differential equation to characterize the state.
arXiv Detail & Related papers (2020-01-15T17:15:24Z)
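The control-variates idea from the first related paper above (using an antiderivative-like surrogate to reduce Monte Carlo variance) can be sketched with a hand-chosen closed-form control variate standing in for the learned network; the truncated Taylor polynomial below is an illustrative stand-in, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

f = lambda x: np.exp(x)          # integrand on [0, 1]; true integral = e - 1
g = lambda x: 1 + x + x**2 / 2   # control variate: truncated Taylor series of f
G = 1 + 1/2 + 1/6                # exact integral of g over [0, 1]

x = rng.uniform(0.0, 1.0, 100_000)
plain = f(x).mean()              # plain Monte Carlo estimate of the integral
cv = (f(x) - g(x)).mean() + G    # control-variate estimate: MC on the residual
print(plain, cv, np.e - 1)
```

Because f - g is much smaller than f on [0, 1], the residual estimator has far lower variance than the plain one at the same sample count.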
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.