Universal approximation property of neural stochastic differential equations
- URL: http://arxiv.org/abs/2503.16696v1
- Date: Thu, 20 Mar 2025 20:34:23 GMT
- Title: Universal approximation property of neural stochastic differential equations
- Authors: Anna P. Kwossek, David J. Prömel, Josef Teichmann
- Abstract summary: We identify various classes of neural networks that are able to approximate continuous functions locally uniformly subject to fixed global linear growth constraints. For such neural networks, the associated neural stochastic differential equations can approximate general stochastic differential equations of Itô diffusion type arbitrarily well.
- Score: 2.6490401904186758
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We identify various classes of neural networks that are able to approximate continuous functions locally uniformly subject to fixed global linear growth constraints. For such neural networks, the associated neural stochastic differential equations can approximate general stochastic differential equations of Itô diffusion type arbitrarily well. Moreover, quantitative error estimates are derived for stochastic differential equations with sufficiently regular coefficients.
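As a concrete illustration of the objects in the abstract, the following is a minimal sketch (an assumed toy setup, not the paper's construction) of a one-dimensional neural SDE dX_t = b_\theta(X_t) dt + \sigma_\theta(X_t) dW_t, simulated with the Euler-Maruyama scheme. The drift b_\theta and diffusion \sigma_\theta are small one-hidden-layer tanh networks with random placeholder weights; the bounded tanh activation keeps both coefficient maps Lipschitz and of linear growth, in the spirit of the growth constraints above.

    import numpy as np

    rng = np.random.default_rng(0)
    h = 16  # hidden width

    def mlp(x, W1, b1, w2):
        # One-hidden-layer tanh network R -> R; the bounded activation
        # keeps the map globally Lipschitz and of linear growth.
        return w2 @ np.tanh(W1 * x + b1)

    # Illustrative random parameters for drift b and diffusion sigma.
    drift = (rng.normal(size=h), rng.normal(size=h), rng.normal(size=h) / h)
    sigma = (rng.normal(size=h), rng.normal(size=h), rng.normal(size=h) / h)

    def euler_maruyama(x0, T=1.0, n=1000):
        # Simulate dX_t = b(X_t) dt + sigma(X_t) dW_t on [0, T].
        dt = T / n
        x, path = x0, [x0]
        for _ in range(n):
            dW = rng.normal(scale=np.sqrt(dt))
            x = x + mlp(x, *drift) * dt + mlp(x, *sigma) * dW
            path.append(x)
        return np.array(path)

    path = euler_maruyama(x0=1.0)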
Related papers
- Generalizing Stochastic Smoothing for Differentiation and Gradient Estimation [59.86921150579892]
We deal with the problem of gradient estimation for differentiable relaxations of algorithms, operators, simulators, and other non-differentiable functions.
We develop variance reduction strategies for differentiable sorting and ranking, differentiable shortest-paths on graphs, differentiable rendering for pose estimation, as well as differentiable cryo-ET simulations.
arXiv Detail & Related papers (2024-10-10T17:10:00Z)
- Modify Training Directions in Function Space to Reduce Generalization Error [9.821059922409091]
We propose a modified natural gradient descent method in the neural network function space, based on eigendecompositions of the neural tangent kernel and the Fisher information matrix.
We explicitly derive the generalization error of the learned neural network function, using theoretical tools from eigendecomposition and statistics.
arXiv Detail & Related papers (2023-07-25T07:11:30Z)
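A hedged sketch of the kind of update described in the entry above: eigendecompose the Fisher information matrix and precondition the gradient only along selected eigendirections, i.e. a truncated F^{-1} g. The truncation rule, the name modified_natural_gradient, and the toy matrices are illustrative assumptions, not the paper's exact method.

    import numpy as np

    def modified_natural_gradient(grad, fisher, k):
        # Eigendecompose the symmetric PSD Fisher information matrix.
        eigvals, eigvecs = np.linalg.eigh(fisher)
        # Keep the k leading eigendirections and precondition the
        # gradient by the inverse eigenvalues there (truncated F^{-1} g).
        idx = np.argsort(eigvals)[::-1][:k]
        V = eigvecs[:, idx]
        inv = 1.0 / np.maximum(eigvals[idx], 1e-8)  # guard tiny eigenvalues
        return V @ (inv * (V.T @ grad))

    # Toy usage: a random PSD "Fisher" matrix and a gradient in R^10.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(10, 10))
    fisher = A @ A.T
    grad = rng.normal(size=10)
    step = modified_natural_gradient(grad, fisher, k=5)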
- On the Generalization and Approximation Capacities of Neural Controlled Differential Equations [0.3222802562733786]
Neural Controlled Differential Equations (NCDEs) are a state-of-the-art tool for supervised learning with irregularly sampled time series.
We show how classical approximation results on neural nets may transfer to NCDEs.
arXiv Detail & Related papers (2023-05-26T10:02:32Z)
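A minimal sketch of the NCDE mechanism referenced above: a hidden state z evolves as dz_t = f_\theta(z_t) dx_t, driven by the observed control path x, here discretised with an Euler scheme. The dimensions, the single-layer form of f_theta, and the random control path are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    d_z, d_x = 4, 2  # hidden-state and control-path dimensions
    W = rng.normal(size=(d_z * d_x, d_z)) / d_z  # parameters of f_theta

    def f_theta(z):
        # Maps the state z in R^{d_z} to a matrix in R^{d_z x d_x}.
        return np.tanh(W @ z).reshape(d_z, d_x)

    def ncde(z0, xs):
        # xs: control path of shape (n, d_x), e.g. an interpolation of
        # irregularly sampled observations.
        z = z0
        for k in range(len(xs) - 1):
            z = z + f_theta(z) @ (xs[k + 1] - xs[k])  # Euler step
        return z

    xs = np.cumsum(rng.normal(scale=0.1, size=(50, d_x)), axis=0)
    z_T = ncde(np.zeros(d_z), xs)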
- Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence [65.63201894457404]
We propose a novel non-parametric learning paradigm for the identification of drift and diffusion coefficients of non-linear stochastic differential equations. The key idea essentially consists of fitting an RKHS-based approximation of the corresponding Fokker-Planck equation to the observed data.
arXiv Detail & Related papers (2023-05-24T20:43:47Z)
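For reference, in its one-dimensional form: for an Itô diffusion dX_t = b(X_t) dt + \sigma(X_t) dW_t, the Fokker-Planck equation fitted above governs the density p(t, x) of X_t via

    \partial_t p(t,x) = -\,\partial_x \big( b(x)\, p(t,x) \big) + \tfrac{1}{2}\, \partial_x^2 \big( \sigma^2(x)\, p(t,x) \big),

so matching an RKHS-based approximation of this identity to the observed density evolution constrains the unknown coefficients b and \sigma.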
- Neuro-symbolic partial differential equation solver [0.0]
We present a strategy for developing mesh-free neuro-symbolic partial differential equation solvers from numerical discretizations found in scientific computing.
This strategy is unique in that it can be used to efficiently train neural network surrogate models for the solution functions and the differential operators.
arXiv Detail & Related papers (2022-10-25T22:56:43Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
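To make the Laplace-domain representation above concrete: under the Laplace transform F(s) = \int_0^\infty e^{-st} f(t)\, dt, a summation of complex exponentials in time corresponds to a summation of simple poles in s,

    f(t) = \sum_k c_k\, e^{s_k t} \quad \Longleftrightarrow \quad F(s) = \sum_k \frac{c_k}{s - s_k},

so history-dependencies and discontinuities that are awkward in the time domain become algebraic structure in s. (This is the standard identity behind such representations, not a formula quoted from the paper.)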
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations [37.02511585732081]
We perform scalable approximate inference in a recently-proposed family of continuous-depth neural networks.
We demonstrate gradient-based variational inference, producing arbitrarily-flexible approximate posteriors.
This approach further inherits the memory-efficient training and tunable precision of neural ODEs.
arXiv Detail & Related papers (2021-02-12T14:48:58Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of 'invariance under coarse-graining'.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
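A hedged sketch of the min-max formulation above as simultaneous gradient descent-ascent on two parameter vectors; a bilinear-quadratic toy payoff stands in for the two neural-network players, and the objective and step size are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(5, 5))
    theta = rng.normal(size=5)  # minimizing player
    w = rng.normal(size=5)      # maximizing (adversarial) player
    eta = 0.05                  # step size

    # Toy payoff: min over theta, max over w of
    # w^T A theta - 0.5 ||w||^2 + 0.5 ||theta||^2.
    for _ in range(500):
        grad_theta = A.T @ w + theta  # gradient w.r.t. theta
        grad_w = A @ theta - w        # gradient w.r.t. w
        theta -= eta * grad_theta     # descent step for theta
        w += eta * grad_w             # ascent step for w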
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Learning To Solve Differential Equations Across Initial Conditions [12.66964917876272]
A number of neural network-based partial differential equation solvers have been formulated that deliver performance equivalent, and in some cases even superior, to that of classical solvers.
In this work, we posit the problem of approximating the solution of a fixed partial differential equation for any arbitrary initial conditions as learning a conditional probability distribution.
arXiv Detail & Related papers (2020-03-26T21:29:22Z)
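A loose sketch of the setup above: learn the solution map of one fixed equation as a function of the initial condition, from samples drawn across many initial conditions. The toy ODE du/dt = -u (exact solution u0 * exp(-t)) and a deterministic linear regression on hand-picked features stand in for the paper's conditional probability model; everything here is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(0)

    # Training pairs sampled across random initial conditions and times.
    u0 = rng.uniform(-2.0, 2.0, size=1000)
    t = rng.uniform(0.0, 3.0, size=1000)
    u = u0 * np.exp(-t)  # ground-truth solutions of du/dt = -u

    # Features u0 * t^k: a crude polynomial-in-time surrogate model.
    X = np.stack([u0, u0 * t, u0 * t**2, u0 * t**3], axis=1)
    coef, *_ = np.linalg.lstsq(X, u, rcond=None)

    # Predict the solution for an unseen initial condition.
    u0_new, t_new = 1.5, 1.0
    x_new = np.array([u0_new, u0_new * t_new, u0_new * t_new**2,
                      u0_new * t_new**3])
    print(x_new @ coef, "vs exact", u0_new * np.exp(-t_new))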
- Neural network representation of the probability density function of diffusion processes [0.0]
Physics-informed neural networks are developed to characterize the state of dynamical systems in a random environment.
We examine analytically and numerically the advantages and disadvantages of solving each type of differential equation to characterize the state.
arXiv Detail & Related papers (2020-01-15T17:15:24Z)
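A hedged sketch of the physics-informed idea in the entry above: penalize the residual of the governing equation at collocation points. Here the equation is p_t = (1/2) p_xx, the Fokker-Planck equation of standard Brownian motion, and derivatives are taken by finite differences of a candidate density on a grid; an actual physics-informed neural network would instead differentiate a network output automatically. The grid sizes and the Gaussian candidate are illustrative assumptions.

    import numpy as np

    nx, nt = 64, 32
    xs = np.linspace(-4.0, 4.0, nx)
    ts = np.linspace(0.1, 1.0, nt)
    dx, dt = xs[1] - xs[0], ts[1] - ts[0]
    T, X = np.meshgrid(ts, xs, indexing="ij")

    # Candidate density: the exact heat-kernel solution, so the
    # discretized PDE residual should be close to zero.
    p = np.exp(-X**2 / (2.0 * T)) / np.sqrt(2.0 * np.pi * T)

    # Central finite differences on the interior of the grid.
    p_t = (p[2:, 1:-1] - p[:-2, 1:-1]) / (2.0 * dt)
    p_xx = (p[1:-1, 2:] - 2.0 * p[1:-1, 1:-1] + p[1:-1, :-2]) / dx**2
    residual = p_t - 0.5 * p_xx
    print("mean squared PDE residual:", np.mean(residual**2))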
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including any of the listed content) and is not responsible for any consequences of its use.