A time-stepping deep gradient flow method for option pricing in (rough)
diffusion models
- URL: http://arxiv.org/abs/2403.00746v1
- Date: Fri, 1 Mar 2024 18:46:26 GMT
- Title: A time-stepping deep gradient flow method for option pricing in (rough) diffusion models
- Authors: Antonis Papapantoleon and Jasper Rou
- Abstract summary: We develop a novel deep learning approach for pricing European options in diffusion models.
The proposed scheme respects the behavior of option prices for large levels of moneyness, and adheres to a priori known bounds for option prices.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop a novel deep learning approach for pricing European options in
diffusion models that can efficiently handle high-dimensional problems
resulting from Markovian approximations of rough volatility models. The option
pricing partial differential equation is reformulated as an energy minimization
problem, which is approximated in a time-stepping fashion by deep artificial
neural networks. The proposed scheme respects the asymptotic behavior of option
prices for large levels of moneyness and adheres to a priori known bounds for
option prices. The accuracy and efficiency of the proposed method are assessed
in a series of numerical examples, with particular focus on the lifted Heston
model.
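The time-stepping idea in the abstract, implicit time discretization realized as a sequence of energy minimizations (a minimizing movement scheme), can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's method: it uses a one-dimensional heat-type pricing equation on a uniform grid, a plain grid vector and gradient descent in place of the paper's deep neural networks, and the Laplacian in place of the lifted Heston generator. The grid size, step sizes, and payoff below are all illustrative choices. Each time step minimizes J(v) = ||v - u^n||^2/(2*dt) + (1/2)||dv/dx||^2 over candidate solutions v:

```python
import numpy as np

# Illustrative setup: uniform grid on [0, 1], call-style payoff, Dirichlet boundaries.
N = 101
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]
dt = 1e-3

def energy(v, u_prev):
    # Discrete J(v) = ||v - u_prev||^2 / (2 dt) + (1/2) ||v_x||^2
    grad = np.diff(v) / h
    return 0.5 * np.sum((v - u_prev) ** 2) * h / dt + 0.5 * np.sum(grad ** 2) * h

def energy_grad(v, u_prev):
    # Gradient of J: (h/dt)(v - u_prev) plus the discrete negative Laplacian.
    g = (v - u_prev) * h / dt
    lap = np.zeros_like(v)
    lap[1:-1] = (2.0 * v[1:-1] - v[:-2] - v[2:]) / h
    g += lap
    g[0] = g[-1] = 0.0  # hold boundary values fixed (a priori known bounds)
    return g

payoff = np.maximum(x - 0.5, 0.0)  # toy terminal condition
u = payoff.copy()
for step in range(10):            # outer time-stepping loop
    v = u.copy()
    for _ in range(500):          # inner minimization: gradient flow on J
        v -= 0.004 * energy_grad(v, u)  # step size chosen for stability on this grid
    u = v                         # u^{n+1} = approximate argmin of J
```

In the paper, the candidate solution v is parameterized by a deep neural network and the inner minimization is done by stochastic gradient descent over network weights, which is what makes the approach scale to the high-dimensional Markovian approximations of rough volatility models; the structure of the loop, however, is the same: one energy minimization per time step.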
Related papers
- Jump Diffusion-Informed Neural Networks with Transfer Learning for Accurate American Option Pricing under Data Scarcity [1.998862666797032]
This study presents a comprehensive framework for American option pricing consisting of six interrelated modules.
The framework combines nonlinear optimization algorithms, analytical and numerical models, and neural networks to improve pricing performance.
The proposed model shows superior performance in pricing deep out-of-the-money options.
arXiv Detail & Related papers (2024-09-26T17:50:12Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Rejection via Learning Density Ratios [50.91522897152437]
Classification with rejection emerges as a learning paradigm which allows models to abstain from making predictions.
We propose a different distributional perspective, where we seek to find an idealized data distribution which maximizes a pretrained model's performance.
Our framework is tested empirically over clean and noisy datasets.
arXiv Detail & Related papers (2024-05-29T01:32:17Z)
- A deep implicit-explicit minimizing movement method for option pricing in jump-diffusion models [0.0]
We develop a novel deep learning approach for pricing European basket options written on assets that follow jump-diffusion dynamics.
The option pricing problem is formulated as a partial integro-differential equation, which is approximated via a new implicit-explicit minimizing movement time-stepping approach.
arXiv Detail & Related papers (2024-01-12T18:21:01Z)
- Generative Fractional Diffusion Models [53.36835573822926]
We introduce the first continuous-time score-based generative model that leverages fractional diffusion processes for its underlying dynamics.
Our evaluations on real image datasets demonstrate that GFDM achieves greater pixel-wise diversity and enhanced image quality, as indicated by a lower FID.
arXiv Detail & Related papers (2023-10-26T17:53:24Z)
- Fast Diffusion EM: a diffusion model for blind inverse problems with application to deconvolution [0.0]
Current methods assume the degradation to be known and provide impressive results in terms of restoration and diversity.
In this work, we leverage the efficiency of those models to jointly estimate the restored image and unknown parameters of the kernel model.
Our method alternates between approximating the expected log-likelihood of the problem using samples drawn from a diffusion model and a step to estimate unknown model parameters.
arXiv Detail & Related papers (2023-09-01T06:47:13Z)
- Planning with Diffusion for Flexible Behavior Synthesis [125.24438991142573]
We consider what it would look like to fold as much of the trajectory optimization pipeline as possible into the modeling problem.
The core of our technical approach lies in a diffusion probabilistic model that plans by iteratively denoising trajectories.
arXiv Detail & Related papers (2022-05-20T07:02:03Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Neural Options Pricing [0.0]
We treat neural SDEs as universal Ito process approximators.
We compute theoretical option prices numerically.
It is conjectured that the error of the option price implied by the learnt model can be bounded by the Wasserstein distance metric.
arXiv Detail & Related papers (2021-05-27T17:22:30Z)
- Arbitrage-free neural-SDE market models [6.145654286950278]
We develop a nonparametric model for the European options book respecting underlying financial constraints.
We study the inference problem where a model is learnt from discrete time series data of stock and option prices.
We use neural networks as function approximators for the drift and diffusion of the modelled SDE system.
arXiv Detail & Related papers (2021-05-24T00:53:10Z)
- Learnable Bernoulli Dropout for Bayesian Deep Learning [53.79615543862426]
Learnable Bernoulli dropout (LBD) is a new model-agnostic dropout scheme that considers the dropout rates as parameters jointly optimized with other model parameters.
LBD leads to improved accuracy and uncertainty estimates in image classification and semantic segmentation.
arXiv Detail & Related papers (2020-02-12T18:57:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.