A Data-driven Market Simulator for Small Data Environments
- URL: http://arxiv.org/abs/2006.14498v1
- Date: Sun, 21 Jun 2020 14:04:21 GMT
- Title: A Data-driven Market Simulator for Small Data Environments
- Authors: Hans Bühler, Blanka Horvath, Terry Lyons, Imanol Perez Arribas, and Ben Wood
- Abstract summary: Neural network based data-driven market simulation unveils a new and flexible way of modelling financial time series.
We show how a rough paths perspective combined with a parsimonious Variational Autoencoder framework provides a powerful way for encoding and evaluating financial time series.
- Score: 0.5872014229110214
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural network based data-driven market simulation unveils a new and flexible
way of modelling financial time series without imposing assumptions on the
underlying stochastic dynamics. Though in this sense generative market
simulation is model-free, the concrete modelling choices are nevertheless
decisive for the features of the simulated paths. We give a brief overview of
currently used generative modelling approaches and performance evaluation
metrics for financial time series, and address some of the challenges to
achieve good results in the latter. We also contrast some classical approaches
of market simulation with simulation based on generative modelling and
highlight some advantages and pitfalls of the new approach. While most
generative models tend to rely on large amounts of training data, we present
here a generative model that works reliably in environments where the amount of
available training data is notoriously small. Furthermore, we show how a rough
paths perspective combined with a parsimonious Variational Autoencoder
framework provides a powerful way for encoding and evaluating financial time
series in such environments where available training data is scarce. Finally,
we also propose a suitable performance evaluation metric for financial time
series and discuss some connections of our Market Generator to deep hedging.
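As a rough, non-authoritative illustration of the pipeline the abstract describes, the sketch below encodes short price paths via truncated log-signatures (here computed with the iisignature package) and fits a small Variational Autoencoder on those features in PyTorch. The architecture, feature choices, and hyperparameters are assumptions for illustration only, not the authors' implementation.

```python
# Illustrative sketch: a small VAE trained on truncated log-signatures of paths.
# Library choices, architecture and hyperparameters are assumptions, not the paper's code.
import numpy as np
import torch
import torch.nn as nn
import iisignature  # pip install iisignature

SIG_LEVEL = 3            # truncation level of the log-signature
PATH_LEN, DIM = 20, 2    # e.g. (time, log-price) points per path
LATENT = 4

prep = iisignature.prepare(DIM, SIG_LEVEL)
SIG_DIM = iisignature.logsiglength(DIM, SIG_LEVEL)

def logsig_features(paths):
    """paths: (n_paths, PATH_LEN, DIM) array -> (n_paths, SIG_DIM) log-signature features."""
    return np.stack([iisignature.logsig(p, prep) for p in paths])

class SigVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(SIG_DIM, 32), nn.ReLU())
        self.mu = nn.Linear(32, LATENT)
        self.logvar = nn.Linear(32, LATENT)
        self.dec = nn.Sequential(nn.Linear(LATENT, 32), nn.ReLU(), nn.Linear(32, SIG_DIM))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparametrisation trick
        return self.dec(z), mu, logvar

def elbo_loss(recon, x, mu, logvar):
    recon_err = ((recon - x) ** 2).sum(dim=1).mean()              # Gaussian reconstruction term
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
    return recon_err + kl

# Training: feats = torch.tensor(logsig_features(paths), dtype=torch.float32), then the usual
# VAE loop with elbo_loss.  New samples come from decoding z ~ N(0, I); mapping a generated
# (log-)signature back to a path requires a separate inversion step (not shown here).
```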
Related papers
- Recurrent Neural Goodness-of-Fit Test for Time Series [8.22915954499148]
Time series data are crucial across diverse domains such as finance and healthcare.
Traditional evaluation metrics fall short due to the temporal dependencies and potentially high dimensionality of the features.
We propose the REcurrent NeurAL (RENAL) Goodness-of-Fit test, a novel and statistically rigorous framework for evaluating generative time series models.
arXiv Detail & Related papers (2024-10-17T19:32:25Z)
- Learning Augmentation Policies from A Model Zoo for Time Series Forecasting [58.66211334969299]
We introduce AutoTSAug, a learnable data augmentation method based on reinforcement learning.
By augmenting the marginal samples with a learnable policy, AutoTSAug substantially improves forecasting performance.
arXiv Detail & Related papers (2024-09-10T07:34:19Z)
- Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- Deep Generative Modeling for Financial Time Series with Application in VaR: A Comparative Review [22.52651841623703]
Historical simulation (HS) uses the empirical distribution of daily returns in a historical window as the forecast distribution of risk factor returns for the next day (a minimal HS VaR sketch in code appears after this list).
HS, GARCH and CWGAN models are tested on both historical USD yield curve data and additional data simulated from GARCH and CIR processes.
The study shows that the top-performing models are HS, GARCH and CWGAN.
arXiv Detail & Related papers (2024-01-18T20:35:32Z)
- STORM: Efficient Stochastic Transformer based World Models for Reinforcement Learning [82.03481509373037]
Recently, model-based reinforcement learning algorithms have demonstrated remarkable efficacy in visual input environments.
We introduce the Stochastic Transformer-based wORld Model (STORM), an efficient world model architecture that combines strong modeling and generation capabilities.
STORM achieves a mean human performance of 126.7% on the Atari 100k benchmark, setting a new record among state-of-the-art methods.
arXiv Detail & Related papers (2023-10-14T16:42:02Z)
- Synthetic Model Combination: An Instance-wise Approach to Unsupervised Ensemble Learning [92.89846887298852]
Consider making a prediction over new test data without any opportunity to learn from a training set of labelled data, given only access to a set of expert models and their predictions alongside some limited information about the dataset used to train them.
arXiv Detail & Related papers (2022-10-11T10:20:31Z)
- Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence [0.19686770963118383]
Deep learning models can mitigate certain biases in current state-of-the-art weather models.
Data-driven models require a lot of training data which may not be available from reanalysis (observational data) products.
Deterministic data-driven forecasting models suffer from issues with long-term stability and unphysical climate drift.
We propose a convolutional variational autoencoder-based data-driven model that is pre-trained on an imperfect climate model simulation.
arXiv Detail & Related papers (2022-05-09T23:52:37Z)
- Black-box Bayesian inference for economic agent-based models [0.0]
We investigate the efficacy of two classes of black-box approximate Bayesian inference methods.
We demonstrate that neural network based black-box methods provide state of the art parameter inference for economic simulation models.
arXiv Detail & Related papers (2022-02-01T18:16:12Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z)
- Recurrent Conditional Heteroskedasticity [0.0]
We propose a new class of financial volatility models, called the REcurrent Conditional Heteroskedastic (RECH) models.
In particular, we incorporate auxiliary deterministic processes, governed by recurrent neural networks, into the conditional variance of traditional conditional heteroskedastic models (a rough sketch of this idea appears after this list).
arXiv Detail & Related papers (2020-10-25T08:09:29Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
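The historical-simulation approach described in the VaR review above reduces to taking an empirical quantile of recent losses. A minimal numpy sketch follows; the window length and confidence level are arbitrary illustrative choices, not values from that paper.

```python
import numpy as np

def historical_var(prices, window=250, alpha=0.99):
    """One-day historical-simulation VaR: the empirical alpha-quantile of losses
    computed from the last `window` daily log-returns."""
    returns = np.diff(np.log(prices))[-window:]   # last `window` daily log-returns
    losses = -returns                             # losses are negated returns
    return np.quantile(losses, alpha)             # e.g. 99% one-day VaR

# Example with synthetic prices (illustration only):
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))
print(f"99% 1-day VaR: {historical_var(prices):.4%}")
```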
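The RECH entry above describes augmenting the conditional variance of a classical conditional heteroskedastic model with an RNN-driven component. The sketch below is only a loose illustration of that general idea, a GARCH(1,1) recursion whose intercept is produced by a GRU cell; it is an assumption-laden toy, not the authors' exact specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentGARCH(nn.Module):
    """Toy GARCH(1,1) whose intercept omega_t comes from a GRU cell, loosely in the
    spirit of RECH; all details here are illustrative assumptions."""
    def __init__(self, hidden=8):
        super().__init__()
        self.cell = nn.GRUCell(2, hidden)             # input: (y_{t-1}, sigma2_{t-1})
        self.to_omega = nn.Linear(hidden, 1)
        self.alpha = nn.Parameter(torch.tensor(0.1))  # ARCH coefficient
        self.beta = nn.Parameter(torch.tensor(0.8))   # GARCH coefficient
        self.hidden = hidden

    def forward(self, y):
        """y: 1-D tensor of returns -> 1-D tensor of conditional variances."""
        h = torch.zeros(1, self.hidden)
        sigma2 = y.var()                              # crude initial variance
        y_prev = torch.tensor(0.0)
        variances = []
        for t in range(len(y)):
            x = torch.stack([y_prev, sigma2]).unsqueeze(0)    # shape (1, 2)
            h = self.cell(x, h)
            omega_t = F.softplus(self.to_omega(h)).squeeze()  # keep the intercept positive
            sigma2 = omega_t + self.alpha * y_prev ** 2 + self.beta * sigma2
            variances.append(sigma2)
            y_prev = y[t]
        return torch.stack(variances)

# Fitting would maximise the Gaussian log-likelihood of y given these variances (omitted).
```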