Recurrent Conditional Heteroskedasticity
- URL: http://arxiv.org/abs/2010.13061v2
- Date: Sat, 22 Jan 2022 02:41:44 GMT
- Title: Recurrent Conditional Heteroskedasticity
- Authors: T.-N. Nguyen, M.-N. Tran, and R. Kohn
- Abstract summary: We propose a new class of financial volatility models, called the REcurrent Conditional Heteroskedastic (RECH) models.
In particular, we incorporate auxiliary deterministic processes, governed by recurrent neural networks, into the conditional variance of the traditional conditional heteroskedastic models.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new class of financial volatility models, called the REcurrent
Conditional Heteroskedastic (RECH) models, to improve both in-sample analysis
and out-of-sample forecasting of the traditional conditional heteroskedastic
models. In particular, we incorporate auxiliary deterministic processes,
governed by recurrent neural networks, into the conditional variance of the
traditional conditional heteroskedastic models, e.g. GARCH-type models, to
flexibly capture the dynamics of the underlying volatility. RECH models can
detect interesting effects in financial volatility overlooked by the existing
conditional heteroskedastic models such as the GARCH, GJR and EGARCH. The new
models often have good out-of-sample forecasts while still explaining well the
stylized facts of financial volatility by retaining the well-established
features of econometric GARCH-type models. These properties are illustrated
through simulation studies and applications to thirty-one stock indices and
exchange rate data. A user-friendly software package, together with the
examples reported in the paper, is available at https://github.com/vbayeslab.
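The core RECH idea described above can be sketched in a few lines: a GARCH(1,1) variance recursion whose intercept is driven by an auxiliary recurrent state. This is a minimal illustration only; the single-unit tanh RNN, its inputs, and the parameter names below are assumptions, not the paper's exact specification (see the authors' package for the real implementation).

```python
import numpy as np

def rech_variance(y, omega0, alpha, beta, W, U, b, v):
    """Sketch of a RECH-style conditional variance.

    A GARCH(1,1) recursion sigma2[t] = omega_t + alpha*y[t-1]^2
    + beta*sigma2[t-1], where the intercept omega_t is time-varying
    and governed by a hypothetical one-unit RNN state h.
    """
    T = len(y)
    sigma2 = np.empty(T)
    sigma2[0] = np.var(y)  # initialize with the sample variance
    h = 0.0
    for t in range(1, T):
        # RNN update from the previous return and its own state
        h = np.tanh(W * y[t - 1] + U * h + b)
        omega_t = omega0 + v * h  # recurrent, time-varying intercept
        sigma2[t] = omega_t + alpha * y[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2
```

Setting `v = 0` switches the auxiliary process off, and the recursion collapses to a plain GARCH(1,1), which is how the RECH class retains the well-established features of its econometric counterpart.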
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z) - GARCH-Informed Neural Networks for Volatility Prediction in Financial Markets [0.0]
We present a new hybrid Deep Learning model that captures and forecasts market volatility more accurately than either class of models can on its own.
When compared to other time series models, GINN showed superior out-of-sample prediction performance in terms of the Coefficient of Determination ($R^2$), Mean Squared Error (MSE), and Mean Absolute Error (MAE).
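The three comparison metrics named above are standard and easy to compute from a held-out forecast; a minimal sketch (the function name and interface are ours, not from the paper):

```python
import numpy as np

def forecast_metrics(y_true, y_pred):
    """Out-of-sample comparison metrics for volatility forecasts:
    coefficient of determination (R^2), mean squared error (MSE),
    and mean absolute error (MAE)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_true - y_pred
    mse = np.mean(resid ** 2)
    mae = np.mean(np.abs(resid))
    ss_res = np.sum(resid ** 2)                        # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)     # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return r2, mse, mae
```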
arXiv Detail & Related papers (2024-09-30T23:53:54Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, Denoising Diffusion Probabilistic Models (DDPMs), for chirographic data.
Our model, named "ChiroDiff", being non-autoregressive, learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Volatility Based Kernels and Moving Average Means for Accurate
Forecasting with Gaussian Processes [36.712632126776285]
We show how to re-cast a class of volatility models as a hierarchical Gaussian process (GP) model with specialized covariance functions.
Within this framework, we take inspiration from well studied domains to introduce a new class of models, Volt and Magpie, that significantly outperform baselines in stock and wind speed forecasting.
arXiv Detail & Related papers (2022-07-13T23:02:54Z) - Forecasting High-Dimensional Covariance Matrices of Asset Returns with
Hybrid GARCH-LSTMs [0.0]
This paper investigates the ability of hybrid models, mixing GARCH processes and neural networks, to forecast covariance matrices of asset returns.
The proposed model is promising: it not only outperforms the equally weighted portfolio but also beats its econometric counterpart by a significant margin.
arXiv Detail & Related papers (2021-08-25T23:41:43Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
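As a rough illustration of the per-timestep mean/variance parameterization described above, the loss below combines a Gaussian negative log-likelihood with a smoothness penalty on consecutive means. This is a simplified stand-in: the paper's smoothness-inducing term is defined on the model distributions themselves, and the function name and penalty form here are our assumptions.

```python
import numpy as np

def smooth_gaussian_nll(x, mu, log_var, lam=1.0):
    """Per-timestep Gaussian negative log-likelihood plus a
    smoothness penalty on consecutive means (simplified; the SISVAE
    objective smooths the per-timestep distributions, not just means)."""
    # average NLL over timestamps, each with its own mean and variance
    nll = 0.5 * np.mean(log_var + (x - mu) ** 2 / np.exp(log_var))
    # penalize abrupt changes between adjacent means
    smooth = np.mean((mu[1:] - mu[:-1]) ** 2)
    return nll + lam * smooth
```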
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z) - Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z) - A Data-driven Market Simulator for Small Data Environments [0.5872014229110214]
Neural network based data-driven market simulation unveils a new and flexible way of modelling financial time series.
We show how a rough paths perspective combined with a parsimonious Variational Autoencoder framework provides a powerful way for encoding and evaluating financial time series.
arXiv Detail & Related papers (2020-06-21T14:04:21Z) - A generative adversarial network approach to calibration of local
stochastic volatility models [2.1485350418225244]
We propose a fully data-driven approach to calibrate local stochastic volatility (LSV) models.
We parametrize the leverage function by a family of feed-forward neural networks and learn their parameters directly from the available market option prices.
This should be seen in the context of neural SDEs and (causal) generative adversarial networks.
arXiv Detail & Related papers (2020-05-05T21:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.