Variational Heteroscedastic Volatility Model
- URL: http://arxiv.org/abs/2204.05806v1
- Date: Mon, 11 Apr 2022 15:29:46 GMT
- Title: Variational Heteroscedastic Volatility Model
- Authors: Zexuan Yin, Paolo Barucca
- Abstract summary: We propose the Variational Heteroscedastic Volatility Model (VHVM), an end-to-end neural network architecture capable of modelling heteroscedastic behaviour in multivariate financial time series.
VHVM consists of a variational autoencoder to capture relationships between assets, and a recurrent neural network to model the time-evolution of these dependencies.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose the Variational Heteroscedastic Volatility Model (VHVM), an
end-to-end neural network architecture capable of modelling heteroscedastic
behaviour in multivariate financial time series. VHVM leverages recent advances
in several areas of deep learning, namely sequential modelling and
representation learning, to model complex temporal dynamics between different
asset returns. At its core, VHVM consists of a variational autoencoder to
capture relationships between assets, and a recurrent neural network to model
the time-evolution of these dependencies. The outputs of VHVM are time-varying
conditional volatilities in the form of covariance matrices. We demonstrate the
effectiveness of VHVM against existing methods such as Generalised
AutoRegressive Conditional Heteroscedasticity (GARCH) and Stochastic Volatility
(SV) models on a wide range of multivariate foreign currency (FX) datasets.
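As a reading aid, here is a minimal sketch of the pattern the abstract describes: a recurrent network tracks temporal dynamics, a variational encoder maps its state to a latent code, and a decoder emits a lower-triangular Cholesky factor so the predicted covariance is guaranteed positive definite. All layer sizes, names, and the Cholesky parameterisation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class VHVMSketch(nn.Module):
    """Illustrative VAE + GRU covariance model (not the authors' code).

    A GRU summarises the return history, a variational encoder maps the
    summary to a latent state, and a decoder emits a lower-triangular
    Cholesky factor L so that Sigma = L L^T is positive definite.
    """

    def __init__(self, n_assets: int, latent_dim: int = 8, hidden_dim: int = 32):
        super().__init__()
        self.n_assets = n_assets
        self.rnn = nn.GRU(n_assets, hidden_dim, batch_first=True)
        self.enc_mu = nn.Linear(hidden_dim, latent_dim)      # q(z|h): mean
        self.enc_logvar = nn.Linear(hidden_dim, latent_dim)  # q(z|h): log-variance
        # Decoder outputs n diagonal + n(n-1)/2 strictly-lower entries of L.
        self.dec = nn.Linear(latent_dim, n_assets * (n_assets + 1) // 2)

    def forward(self, returns: torch.Tensor):
        # returns: (batch, time, n_assets)
        h, _ = self.rnn(returns)
        h_t = h[:, -1]                                       # state after last step
        mu, logvar = self.enc_mu(h_t), self.enc_logvar(h_t)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterise
        out = self.dec(z)
        n = self.n_assets
        # Softplus keeps the diagonal positive, so L is full rank and Sigma is PD.
        diag = nn.functional.softplus(out[:, :n]) + 1e-6
        L = torch.diag_embed(diag)
        idx = torch.tril_indices(n, n, offset=-1)
        L[:, idx[0], idx[1]] = out[:, n:]
        cov = L @ L.transpose(1, 2)          # time-t conditional covariance
        return cov, mu, logvar

# One-step-ahead covariance from a window of 50 returns on 3 assets.
model = VHVMSketch(n_assets=3)
cov, _, _ = model(torch.randn(1, 50, 3))
print(cov.shape)  # torch.Size([1, 3, 3])
```

The Cholesky route is one common way to keep a neural covariance output valid; the paper's exact parameterisation may differ.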
Related papers
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Copula Variational LSTM for High-dimensional Cross-market Multivariate Dependence Modeling [46.75628526959982]
We make the first attempt to integrate variational sequential neural learning with copula-based dependence modeling.
Our variational neural network, WPVC-VLSTM, models time-varying dependence degrees and structures across time series.
It outperforms benchmarks including linear models, volatility models, deep neural networks, and variational recurrent networks in cross-market portfolio forecasting.
arXiv Detail & Related papers (2023-05-09T08:19:08Z)
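To make the copula idea in the entry above concrete, here is a minimal Gaussian-copula sketch: marginals are stripped away via a rank transform and dependence is read from the correlation of the resulting normal scores. This shows only the classical building block; WPVC-VLSTM's variational sequential machinery is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_corr(returns: np.ndarray) -> np.ndarray:
    """Dependence structure via a Gaussian copula (illustrative, not WPVC-VLSTM).

    returns: (T, n_series) array of asset returns.
    Steps: ranks -> uniform pseudo-observations -> normal scores -> correlation.
    Separating marginals from dependence is the core idea copula models exploit.
    """
    T, _ = returns.shape
    # Rank-based probability integral transform to uniform (0, 1) marginals.
    ranks = returns.argsort(axis=0).argsort(axis=0) + 1
    u = ranks / (T + 1)
    # Normal scores; their correlation matrix parameterises the Gaussian
    # copula, independently of each series' marginal distribution.
    z = norm.ppf(u)
    return np.corrcoef(z, rowvar=False)

# Usage: two correlated series plus an independent one.
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 1))
data = np.hstack([x, 0.8 * x + 0.6 * rng.standard_normal((500, 1)),
                  rng.standard_normal((500, 1))])
print(gaussian_copula_corr(data).round(2))
```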
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, Multi-scale Attention Normalizing Flow (MANF).
Being non-autoregressive, our model avoids the influence of cumulative error and does not increase time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Neural Generalised AutoRegressive Conditional Heteroskedasticity [0.0]
We propose Neural GARCH, a class of methods to model conditional heteroskedasticity in financial time series.
We allow the coefficients of a GARCH model to be time varying in order to reflect the constantly changing dynamics of financial markets.
We propose two variants of our model, one with normal innovations and the other with Student's t innovations.
We find that the Neural Student's t model consistently outperforms the others.
arXiv Detail & Related papers (2022-02-23T03:23:02Z)
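A hedged sketch of the Neural GARCH idea summarised above: the standard GARCH(1,1) variance recursion is kept, but the coefficients (omega, alpha, beta) are produced at each step by a recurrent network. Layer sizes and names are assumptions; the paper's normal and Student's t variants would differ only in the likelihood used to train such a recursion.

```python
import torch
import torch.nn as nn

class NeuralGARCHSketch(nn.Module):
    """Illustrative time-varying GARCH(1,1), in the spirit of Neural GARCH.

    A GRU reads past returns and emits per-step coefficients (omega, alpha,
    beta); the variance then follows the usual recursion
        sigma2_t = omega_t + alpha_t * r_{t-1}^2 + beta_t * sigma2_{t-1}.
    Not the authors' code; layer sizes and names are assumptions.
    """

    def __init__(self, hidden_dim: int = 16):
        super().__init__()
        self.rnn = nn.GRU(1, hidden_dim, batch_first=True)
        self.coef = nn.Linear(hidden_dim, 3)

    def forward(self, returns: torch.Tensor) -> torch.Tensor:
        # returns: (batch, T); output: (batch, T) conditional variances.
        h, _ = self.rnn(returns.unsqueeze(-1))
        # Softplus keeps all three coefficients positive at every step.
        omega, alpha, beta = nn.functional.softplus(self.coef(h)).unbind(-1)
        sigma2 = [returns.var(dim=1)]  # crude initial variance
        for t in range(returns.size(1)):
            prev_r2 = returns[:, t - 1] ** 2 if t > 0 else sigma2[0]
            sigma2.append(omega[:, t] + alpha[:, t] * prev_r2
                          + beta[:, t] * sigma2[-1])
        return torch.stack(sigma2[1:], dim=1)

# Usage: variances for a batch of two return series of length 100.
model = NeuralGARCHSketch()
print(model(torch.randn(2, 100)).shape)  # torch.Size([2, 100])
```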
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster than their ODE-based counterparts.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
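The SISVAE entry above lends itself to a short sketch: per-time-stamp Gaussian parameters from a neural network, plus a smoothness-inducing penalty on consecutive means so abrupt jumps are discouraged and anomalies surface as large residuals. This is an assumption-laden paraphrase of the idea, not the SISVAE objective itself, which is derived variationally.

```python
import torch
import torch.nn as nn

class SmoothGaussianHead(nn.Module):
    """Per-time-stamp mean/variance head with a smoothness penalty (illustrative).

    Each time-stamp gets its own Gaussian parameters from a neural network, and
    a smoothness term discourages abrupt jumps between consecutive means.
    Not the SISVAE authors' code; names and weights are assumptions.
    """

    def __init__(self, input_dim: int, hidden_dim: int = 32):
        super().__init__()
        self.rnn = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.mu = nn.Linear(hidden_dim, input_dim)
        self.logvar = nn.Linear(hidden_dim, input_dim)

    def loss(self, x: torch.Tensor, smooth_weight: float = 1.0) -> torch.Tensor:
        # x: (batch, T, input_dim)
        h, _ = self.rnn(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Gaussian negative log-likelihood per time-stamp (up to a constant).
        nll = 0.5 * (logvar + (x - mu) ** 2 / logvar.exp()).mean()
        # Smoothness-inducing term: penalise jumps between consecutive means.
        smooth = (mu[:, 1:] - mu[:, :-1]).pow(2).mean()
        return nll + smooth_weight * smooth

# Usage: training signal for a batch of 8 series, 60 steps, 4 channels.
head = SmoothGaussianHead(input_dim=4)
print(head.loss(torch.randn(8, 60, 4)).item())
```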
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
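As an illustration of the second stage described in the entry above, the sketch below fits a simple generative model over the latent codes produced by a hypothetical penalty-based stage-1 learner, recovering the correlations the independence penalty suppressed. A full-covariance Gaussian stands in for the paper's deep generative model; all names and data are assumptions.

```python
import torch

def fit_latent_gaussian(z: torch.Tensor) -> torch.distributions.MultivariateNormal:
    """Fit a mean and full covariance to encoded latents z of shape (N, d).

    Stage-2 stand-in: where a penalty-based stage-1 VAE assumed independent
    latent factors, this model captures their actual correlations.
    """
    mu = z.mean(dim=0)
    centered = z - mu
    cov = centered.T @ centered / (z.size(0) - 1)
    return torch.distributions.MultivariateNormal(mu, covariance_matrix=cov)

# Usage with hypothetical stage-1 outputs: correlated codes that the stage-1
# prior treated as independent.
z = torch.randn(1000, 4) @ torch.tensor([[1.0, 0.0, 0.0, 0.0],
                                         [0.7, 0.7, 0.0, 0.0],
                                         [0.0, 0.0, 1.0, 0.0],
                                         [0.0, 0.0, 0.5, 0.9]])
stage2 = fit_latent_gaussian(z)
samples = stage2.sample((16,))   # feed these through the stage-1 decoder
print(samples.shape)             # torch.Size([16, 4])
```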
- Variational Dynamic Mixtures [18.730501689781214]
We develop variational dynamic mixtures (VDM) to infer sequential latent variables.
In an empirical study, we show that VDM outperforms competing approaches on highly multi-modal datasets.
arXiv Detail & Related papers (2020-10-20T16:10:07Z)
- Parsimonious Quantile Regression of Financial Asset Tail Dynamics via Sequential Learning [35.34574502348672]
We propose a parsimonious quantile regression framework to learn the dynamic tail behaviors of financial asset returns.
Our model captures well both the time-varying characteristic and the asymmetrical heavy-tail property of financial time series.
arXiv Detail & Related papers (2020-10-16T09:35:52Z)
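Since the entry above centres on quantile regression, here is the generic mechanism it builds on: the pinball loss, whose minimiser over constants is the empirical tau-quantile, which is what lets a sequential model trace time-varying tails. The sketch is illustrative and does not reproduce the paper's parsimonious sequential architecture.

```python
import numpy as np

def pinball_loss(y: np.ndarray, q_pred: np.ndarray, tau: float) -> float:
    """Quantile (pinball) loss, the training signal behind quantile regression.

    Predictions of the tau-quantile are penalised asymmetrically, so fitted
    quantiles trace the tails of the return distribution rather than its mean.
    """
    err = y - q_pred
    return float(np.mean(np.maximum(tau * err, (tau - 1) * err)))

# Usage: the 5% quantile of heavy-tailed returns sits in the left tail.
rng = np.random.default_rng(1)
returns = rng.standard_t(df=3, size=10_000)
candidates = np.linspace(-5.0, 0.0, 201)
losses = [pinball_loss(returns, np.full_like(returns, c), tau=0.05)
          for c in candidates]
print(candidates[int(np.argmin(losses))])  # near the empirical 5% quantile
```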
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.