Neural network representation of the probability density function of
diffusion processes
- URL: http://arxiv.org/abs/2001.05437v2
- Date: Sun, 19 Apr 2020 20:10:05 GMT
- Title: Neural network representation of the probability density function of
diffusion processes
- Authors: Wayne Isaac Tan Uy, Mircea Grigoriu
- Abstract summary: Physics-informed neural networks are developed to characterize the state of dynamical systems in a random environment.
We examine analytically and numerically the advantages and disadvantages of solving each type of differential equation to characterize the state.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks are developed to characterize the state of
dynamical systems in a random environment. The neural network approximates the
probability density function (pdf) or the characteristic function (chf) of the
state of these systems which satisfy the Fokker-Planck equation or an
integro-differential equation under Gaussian and/or Poisson white noises. We
examine analytically and numerically the advantages and disadvantages of
solving each type of differential equation to characterize the state. It is
also demonstrated how prior information of the dynamical system can be
exploited to design and simplify the neural network architecture. Numerical
examples show that: 1) the neural network solution can approximate the target
solution even for partial integro-differential equations and systems of PDEs
describing the time evolution of the pdf/chf, 2) solving either the
Fokker-Planck equation or the chf differential equation using neural networks
yields similar pdfs of the state, and 3) the solution to these differential
equations can be used to study the behavior of the state for different types of
random forcings.
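To make the abstract's setting concrete, the following is a minimal illustrative sketch (not code from the paper): for an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW, it evaluates the stationary Fokker-Planck residual that a physics-informed neural network would minimize, using the known Gaussian stationary pdf in place of the network. All parameter values are assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical example: physics residual of the stationary Fokker-Planck
# equation for an Ornstein-Uhlenbeck process
#   dX = -theta*X dt + sigma dW
# Stationary FP equation:  theta * d/dx(x p) + (sigma^2/2) * d^2 p/dx^2 = 0
theta, sigma = 1.0, 0.5
var = sigma**2 / (2 * theta)          # stationary variance of the OU process

x = np.linspace(-3.0, 3.0, 2001)
h = x[1] - x[0]
p = np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)  # exact stationary pdf

# central finite differences for the two terms of the residual
drift = np.gradient(x * p, h)                  # d/dx (x * p)
diff2 = np.gradient(np.gradient(p, h), h)      # d^2 p / dx^2
residual = theta * drift + 0.5 * sigma**2 * diff2

# The interior residual vanishes up to discretization error; a PINN replaces
# the exact pdf p by a neural network and minimizes the norm of this residual.
max_res = np.max(np.abs(residual[10:-10]))
print(max_res)
```

In the paper's approach the same residual (in its time-dependent form, or the analogous chf equation) is evaluated by automatic differentiation of the network output rather than finite differences.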
Related papers
- Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z) - A Constructive Approach to Function Realization by Neural Stochastic
Differential Equations [8.04975023021212]
We introduce structural restrictions on system dynamics and characterize the class of functions that can be realized by such a system.
The systems are implemented as a cascade interconnection of a neural stochastic differential equation (Neural SDE), a deterministic dynamical system, and a readout map.
arXiv Detail & Related papers (2023-07-01T03:44:46Z) - Neuro-symbolic partial differential equation solver [0.0]
We present a strategy for developing mesh-free neuro-symbolic partial differential equation solvers from numerical discretizations found in scientific computing.
This strategy is unique in that it can be used to efficiently train neural network surrogate models for the solution functions and the differential operators.
arXiv Detail & Related papers (2022-10-25T22:56:43Z) - Integral Transforms in a Physics-Informed (Quantum) Neural Network
setting: Applications & Use-Cases [1.7403133838762446]
In many computational problems in engineering and science, differentiation of a function or model is essential, but integration is often needed as well.
In this work, we propose to augment the paradigm of Physics-Informed Neural Networks with automatic integration.
arXiv Detail & Related papers (2022-06-28T17:51:32Z) - Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - Partial Differential Equations is All You Need for Generating Neural Architectures -- A Theory for Physical Artificial Intelligence Systems [40.20472268839781]
We generalize the reaction-diffusion equation in statistical physics, the Schrödinger equation in quantum mechanics, and the Helmholtz equation in paraxial optics.
We use the finite difference method to discretize the NPDE and find numerical solutions.
Basic building blocks of deep neural network architectures, including the multi-layer perceptron, convolutional neural network, and recurrent neural network, are generated.
arXiv Detail & Related papers (2021-03-10T00:05:46Z) - Autonomous learning of nonlocal stochastic neuron dynamics [0.0]
Neuronal dynamics is driven by externally imposed or internally generated random excitations/noise, and is often described by systems of random or ordinary differential equations.
It can be used to calculate such information-theoretic quantities as the mutual information between the stimulus and various internal states of the neuron.
We propose two methods for closing such equations: a modified nonlocal large-eddy-diffusivity closure and a data-driven closure relying on sparse regression to learn relevant features.
arXiv Detail & Related papers (2020-11-22T06:47:18Z) - Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid
Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
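Several of the papers above (Message Passing Neural PDE Solvers, the neuro-symbolic solver, the NPDE work) note that classical discretizations are special cases of learned solvers. A minimal illustrative sketch, not taken from any of these papers: an explicit finite-difference step for the 1D heat equation, written as a fixed three-point "message passing" stencil [1, -2, 1] over neighbouring grid nodes; neural solvers replace such hand-designed stencils with learned update functions. All numerical parameters here are assumptions chosen for illustration.

```python
import numpy as np

# Explicit finite-difference solver for the 1D heat equation u_t = alpha * u_xx
# with zero boundary conditions, viewed as message passing with a fixed stencil.
alpha, dx, dt = 1.0, 0.1, 0.002   # dt chosen so alpha*dt/dx^2 <= 0.5 (stable)
x = np.arange(0.0, 1.0 + dx, dx)
u = np.sin(np.pi * x)             # initial condition, u = 0 at both boundaries

r = alpha * dt / dx**2
for _ in range(100):
    # each interior node aggregates "messages" from its left/right neighbours
    lap = u[:-2] - 2.0 * u[1:-1] + u[2:]   # discrete Laplacian, stencil [1,-2,1]
    u[1:-1] = u[1:-1] + r * lap            # explicit Euler update (interior only)

# compare against the exact solution sin(pi x) * exp(-pi^2 * alpha * t)
t = 100 * dt
u_exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * alpha * t)
err = np.max(np.abs(u - u_exact))
print(err)
```

A learned solver keeps the same graph structure (nodes exchanging messages with neighbours) but parameterizes the aggregation and update steps with neural networks trained by backpropagation.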
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.