Integral Transforms in a Physics-Informed (Quantum) Neural Network
setting: Applications & Use-Cases
- URL: http://arxiv.org/abs/2206.14184v1
- Date: Tue, 28 Jun 2022 17:51:32 GMT
- Title: Integral Transforms in a Physics-Informed (Quantum) Neural Network
setting: Applications & Use-Cases
- Authors: Niraj Kumar, Evan Philip, Vincent E. Elfving
- Abstract summary: In many computational problems in engineering and science, not only function or model differentiation but also integration is needed.
In this work, we propose to augment the paradigm of Physics-Informed Neural Networks with automatic integration.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many computational problems in engineering and science, not
only function or model differentiation but also integration is needed. An
important class of computational problems includes so-called
integro-differential equations, which involve both integrals and derivatives
of a function. In
another example, stochastic differential equations can be written in terms of a
partial differential equation of a probability density function of the
stochastic variable. To learn characteristics of the stochastic variable based
on the density function, specific integral transforms, namely moments, of the
density function need to be calculated. Recently, the machine learning paradigm
of Physics-Informed Neural Networks emerged with increasing popularity as a
method to solve differential equations by leveraging automatic differentiation.
In this work, we propose to augment the paradigm of Physics-Informed Neural
Networks with automatic integration in order to compute complex integral
transforms on trained solutions, and to solve integro-differential equations
where integrals are computed on-the-fly during training. Furthermore, we
showcase the techniques in various application settings, numerically simulating
quantum computer-based neural networks as well as classical neural networks.
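The moment computation described in the abstract can be illustrated with a short numerical sketch. This is not the paper's implementation: a closed-form Gaussian stands in for a trained (quantum) neural network that has learned a density, and trapezoidal quadrature stands in for automatic integration. All function names here are illustrative assumptions.

```python
import numpy as np

def density_model(x):
    """Stand-in for a trained network representing a probability density;
    here the standard normal pdf, chosen only so results are checkable."""
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def moment(f, k, lo=-10.0, hi=10.0, n=20001):
    """k-th moment of the density f, an integral transform computed by
    trapezoidal quadrature over a truncated domain."""
    x = np.linspace(lo, hi, n)
    y = x**k * f(x)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

mean = moment(density_model, 1)      # first moment of the learned density
variance = moment(density_model, 2)  # second moment (variance, since mean is 0)
print(mean, variance)
```

For a standard normal density the first moment is 0 and the second is 1, so the quadrature can be sanity-checked against those values.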
Related papers
- Physics-Informed Quantum Machine Learning for Solving Partial
Differential Equations [0.0]
We propose a tensor product over a summation of Pauli-Z operators as a change in the measurement observables.
This idea has been tested on solving the complex dynamics of a Riccati equation.
A new quantum circuit structure is proposed to approximate multivariable functions, tested on solving a 2D Poisson's equation.
arXiv Detail & Related papers (2023-12-14T18:46:35Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
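The loss whose curvature is discussed above is the standard PINN residual loss. A minimal sketch for the ODE y' + y = 0 with y(0) = 1 follows; central finite differences stand in for the automatic differentiation a real PINN would use, and the function names are illustrative, not from the paper.

```python
import numpy as np

def pinn_style_loss(y_fn, x, h=1e-5):
    """Mean-squared ODE residual of y' + y = 0 plus a boundary penalty
    enforcing y(0) = 1. Central differences approximate y' in this sketch."""
    dydx = (y_fn(x + h) - y_fn(x - h)) / (2 * h)
    residual = dydx + y_fn(x)
    boundary = (y_fn(np.array([0.0]))[0] - 1.0) ** 2
    return float(np.mean(residual**2) + boundary)

x = np.linspace(0.0, 2.0, 101)   # collocation points

def exact(x):
    return np.exp(-x)            # the true solution: near-zero loss

def wrong(x):
    return np.cos(x)             # a poor candidate: large residual loss

print(pinn_style_loss(exact, x), pinn_style_loss(wrong, x))
```

Training a PINN amounts to driving this loss toward zero over the network's parameters; the benchmark paper's point is that this optimization becomes hard as the ODE system grows stiffer and more strongly coupled.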
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Stochastic Scaling in Loss Functions for Physics-Informed Neural Networks [0.0]
Trained neural networks act as universal function approximators, able to numerically solve differential equations in a novel way.
Variations on traditional loss function and training parameters show promise in making neural network-aided solutions more efficient.
arXiv Detail & Related papers (2022-08-07T17:12:39Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
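The Laplace-domain representation above can be illustrated concretely: a damped oscillation in time is exactly the real part of a single complex exponential exp(s t) with complex s, the kind of summand the framework uses. A minimal numerical check (illustrative, not the paper's code):

```python
import numpy as np

# Time-domain trajectory: a damped oscillation f(t) = exp(-t) * cos(2t).
t = np.linspace(0.0, 5.0, 200)
time_domain = np.exp(-t) * np.cos(2 * t)

# In the Laplace picture the same trajectory is one complex exponential
# with pole s = -1 + 2j: f(t) = Re(exp(s * t)).
s = -1.0 + 2.0j
laplace_mode = np.real(np.exp(s * t))

print(np.max(np.abs(time_domain - laplace_mode)))
```

Sums of such modes can also express history dependence and discontinuities that are awkward to model directly in the time domain, which is the motivation for working in the Laplace domain.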
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- AutoIP: A United Framework to Integrate Physics into Gaussian Processes [15.108333340471034]
We propose a framework that can integrate all kinds of differential equations into Gaussian processes.
Our method shows improvement upon vanilla GPs in both simulation and several real-world applications.
arXiv Detail & Related papers (2022-02-24T19:02:14Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics-informed neural networks have been applied successfully to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics-informed neural networks have difficulty resolving localized effects and strongly nonlinear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT-scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)
- Partial Differential Equations is All You Need for Generating Neural Architectures -- A Theory for Physical Artificial Intelligence Systems [29.667065357274385]
We generalize the reaction-diffusion equation from statistical physics, the Schrödinger equation from quantum mechanics, and the Helmholtz equation from paraxial optics.
We use the finite difference method to discretize the NPDE and find numerical solutions.
Basic building blocks of deep neural network architectures, including multi-layer perceptrons, convolutional neural networks and recurrent neural networks, are generated.
arXiv Detail & Related papers (2021-03-10T00:05:46Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Neural network representation of the probability density function of diffusion processes [0.0]
Physics-informed neural networks are developed to characterize the state of dynamical systems in a random environment.
We examine analytically and numerically the advantages and disadvantages of solving each type of differential equation to characterize the state.
arXiv Detail & Related papers (2020-01-15T17:15:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.