Predicting path-dependent processes by deep learning
- URL: http://arxiv.org/abs/2408.09941v1
- Date: Mon, 19 Aug 2024 12:24:25 GMT
- Title: Predicting path-dependent processes by deep learning
- Authors: Xudong Zheng, Yuecai Han
- Abstract summary: We investigate a deep learning method for predicting path-dependent processes based on discretely observed historical information.
With the frequency of discrete observations tending to infinity, the predictions based on discrete observations converge to the predictions based on continuous observations.
We apply the method to the fractional Brownian motion and the fractional Ornstein-Uhlenbeck process as examples.
- Score: 0.5893124686141782
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we investigate a deep learning method for predicting path-dependent processes based on discretely observed historical information. The method treats the prediction as a nonparametric regression problem and obtains the regression function from simulated samples via deep neural networks. When applying this method to fractional Brownian motion and the solutions of some stochastic differential equations driven by it, we theoretically prove that the $L_2$ errors converge to 0, and we further discuss the scope of the method. As the frequency of discrete observations tends to infinity, the predictions based on discrete observations converge to the predictions based on continuous observations, which implies that the method can approximate the latter. We apply the method to the fractional Brownian motion and the fractional Ornstein-Uhlenbeck process as examples. Comparing the results with the theoretically optimal predictions and taking the mean square error as a measure, the numerical simulations demonstrate that the method generates accurate results. We also analyze the impact of factors such as the prediction period and the Hurst index on the accuracy.
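The pipeline described in the abstract (simulate paths, treat prediction as nonparametric regression, fit a neural network, compare against the theoretically optimal predictor) can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: the Hurst index, observation grid, network size, and training schedule are arbitrary choices, and the network is a plain one-hidden-layer MLP trained by full-batch gradient descent in numpy.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 0.7                       # Hurst index (illustrative choice)
n_obs, dt = 20, 0.05          # discrete observations of B_H on (0, 1]
t_pred = 1.25                 # time at which we predict B_H
times = np.arange(1, n_obs + 1) * dt
grid = np.append(times, t_pred)

# fBm covariance: Cov(B_H(s), B_H(t)) = 0.5 (s^{2H} + t^{2H} - |t - s|^{2H})
s, t = np.meshgrid(grid, grid)
cov = 0.5 * (s ** (2 * H) + t ** (2 * H) - np.abs(s - t) ** (2 * H))
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(grid)))

def simulate(n):
    """Simulate n fBm paths; return (discrete past observations, future value)."""
    paths = rng.standard_normal((n, len(grid))) @ L.T
    return paths[:, :n_obs], paths[:, -1]

X_train, y_train = simulate(4000)
X_test, y_test = simulate(1000)

# One-hidden-layer MLP for the regression function, trained on the L2 loss
# (constant factors in the gradient are absorbed into the learning rate).
d_h, lr = 32, 0.05
W1 = rng.standard_normal((n_obs, d_h)) / np.sqrt(n_obs)
b1 = np.zeros(d_h)
W2 = rng.standard_normal(d_h) / np.sqrt(d_h)
b2 = 0.0
for _ in range(2000):
    h = np.tanh(X_train @ W1 + b1)
    err = (h @ W2 + b2) - y_train
    gh = np.outer(err, W2) * (1 - h ** 2)
    W2 -= lr * h.T @ err / len(err)
    b2 -= lr * err.mean()
    W1 -= lr * X_train.T @ gh / len(err)
    b1 -= lr * gh.mean(axis=0)

nn_pred = np.tanh(X_test @ W1 + b1) @ W2 + b2
mse_nn = np.mean((nn_pred - y_test) ** 2)

# Benchmark: the optimal prediction E[B_H(t_pred) | past] is linear in the
# observations for this Gaussian process (conditional mean formula).
w = np.linalg.solve(cov[:n_obs, :n_obs], cov[:n_obs, -1])
mse_opt = np.mean((X_test @ w - y_test) ** 2)
```

Because fBm is Gaussian, the conditional mean gives an exact linear benchmark here, which mirrors the paper's comparison of the learned predictor against the theoretically optimal prediction under mean square error.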
Related papers
- Empirical fits to inclusive electron-carbon scattering data obtained by deep-learning methods [0.0]
We obtain empirical fits to the electron-scattering cross sections for carbon over a broad kinematic region.
We consider two different methods of obtaining such model-independent parametrizations and the corresponding uncertainties.
arXiv Detail & Related papers (2023-12-28T17:03:17Z) - Posterior and Computational Uncertainty in Gaussian Processes [52.26904059556759]
Gaussian processes scale prohibitively with the size of the dataset.
Many approximation methods have been developed, which inevitably introduce approximation error.
This additional source of uncertainty, due to limited computation, is entirely ignored when using the approximate posterior.
We develop a new class of methods that provides consistent estimation of the combined uncertainty arising from both the finite number of data observed and the finite amount of computation expended.
arXiv Detail & Related papers (2022-05-30T22:16:25Z) - Probabilistic Estimation of Chirp Instantaneous Frequency Using Gaussian Processes [4.150253997298207]
We present a probabilistic approach for estimating a signal and its instantaneous frequency function when the true forms of the chirp and instantaneous frequency are unknown.
Experiments show that the method outperforms a number of baseline methods on a synthetic model, and we also apply it to analyse gravitational wave data.
arXiv Detail & Related papers (2022-05-12T18:47:13Z) - A blob method for inhomogeneous diffusion with applications to multi-agent control and sampling [0.6562256987706128]
We develop a deterministic particle method for the weighted porous medium equation (WPME) and prove its convergence on bounded time intervals.
Our method has natural applications to multi-agent coverage algorithms and sampling probability measures.
arXiv Detail & Related papers (2022-02-25T19:49:05Z) - Heavy-tailed Streaming Statistical Estimation [58.70341336199497]
We consider the task of heavy-tailed statistical estimation given streaming $p$-dimensional samples.
We design a clipped gradient descent and provide an improved analysis under a more nuanced condition on the noise of gradients.
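The clipped stochastic gradient idea mentioned above can be illustrated with a generic sketch for streaming mean estimation under heavy-tailed noise; this is not the paper's exact estimator or analysis, just the basic mechanism of norm-clipping each stochastic gradient before a Robbins-Monro step.

```python
import numpy as np

rng = np.random.default_rng(2)
true_mean = np.array([1.0, -2.0])

def sample():
    # heavy-tailed stream: Student-t noise (2.5 degrees of freedom) around the mean
    return true_mean + rng.standard_t(2.5, size=2)

theta = np.zeros(2)
clip = 5.0
for t in range(1, 20001):
    g = theta - sample()            # stochastic gradient of 0.5 * ||theta - x||^2
    norm = np.linalg.norm(g)
    if norm > clip:
        g *= clip / norm            # clip the gradient norm to tame heavy tails
    theta -= g / t                  # step size 1/t (Robbins-Monro schedule)

err = np.linalg.norm(theta - true_mean)
```

With symmetric noise the clipped gradients remain unbiased near the optimum, so the iterate concentrates around the true mean even though raw samples occasionally take extreme values.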
arXiv Detail & Related papers (2021-08-25T21:30:27Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
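The pathwise interpretation referenced above is Matheron's rule: instead of sampling from the posterior distribution at a finite set of points, one draws a prior path and corrects it with a data-dependent update. A minimal sketch, with an illustrative squared-exponential kernel and toy data:

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, ell=0.5):
    # squared-exponential kernel (illustrative choice of GP prior)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

X = np.array([-1.0, 0.0, 1.0])        # training inputs
y = np.sin(X)                          # training targets
sigma2 = 1e-6                          # small observation noise
Xs = np.linspace(-2.0, 2.0, 50)        # test locations

# Draw one joint prior sample over test and training locations.
P = np.concatenate([Xs, X])
Kp = rbf(P, P) + 1e-10 * np.eye(len(P))
f = np.linalg.cholesky(Kp) @ rng.standard_normal(len(P))
f_X = f[len(Xs):]                      # prior sample at the training inputs

# Matheron's rule: posterior sample = prior sample + pathwise update.
eps = np.sqrt(sigma2) * rng.standard_normal(len(X))
K_xx = rbf(X, X) + sigma2 * np.eye(len(X))
update = rbf(P, X) @ np.linalg.solve(K_xx, y - (f_X + eps))
posterior_sample = f + update          # posterior sample over all of P
```

The distribution-centric route would factor the full test covariance (cubic in the number of test points), whereas the pathwise update reuses one prior draw and a solve against the training-sized matrix only.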
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - The Shooting Regressor; Randomized Gradient-Based Ensembles [0.0]
An ensemble method is introduced that utilizes randomization and loss function gradients to compute a prediction.
Multiple weakly-correlated estimators approximate the gradient at randomly sampled points on the error surface and are aggregated into a final solution.
arXiv Detail & Related papers (2020-09-14T03:20:59Z) - Path Sample-Analytic Gradient Estimators for Stochastic Binary Networks [78.76880041670904]
In neural networks with binary activations and/or binary weights, training by gradient descent is complicated.
We propose a new method for this estimation problem combining sampling and analytic approximation steps.
We experimentally show higher accuracy in gradient estimation and demonstrate a more stable and better performing training in deep convolutional models.
arXiv Detail & Related papers (2020-06-04T21:51:21Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z) - Maximum likelihood estimation and uncertainty quantification for Gaussian process approximation of deterministic functions [10.319367855067476]
This article provides one of the first theoretical analyses in the context of Gaussian process regression with a noiseless dataset.
We show that the maximum likelihood estimation of the scale parameter alone provides significant adaptation against misspecification of the Gaussian process model.
arXiv Detail & Related papers (2020-01-29T17:20:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.