Parsimonious Quantile Regression of Financial Asset Tail Dynamics via
Sequential Learning
- URL: http://arxiv.org/abs/2010.08263v1
- Date: Fri, 16 Oct 2020 09:35:52 GMT
- Title: Parsimonious Quantile Regression of Financial Asset Tail Dynamics via
Sequential Learning
- Authors: Xing Yan, Weizhong Zhang, Lin Ma, Wei Liu, Qi Wu
- Abstract summary: We propose a parsimonious quantile regression framework to learn the dynamic tail behaviors of financial asset returns.
Our model captures well both the time-varying characteristic and the asymmetrical heavy-tail property of financial time series.
- Score: 35.34574502348672
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a parsimonious quantile regression framework to learn the dynamic
tail behaviors of financial asset returns. Our model captures well both the
time-varying characteristic and the asymmetrical heavy-tail property of
financial time series. It combines the merits of a popular sequential neural
network model, i.e., LSTM, with a novel parametric quantile function that we
construct to represent the conditional distribution of asset returns. Our model
also captures individually the serial dependences of higher moments, rather
than just the volatility. Across a wide range of asset classes, the
out-of-sample forecasts of conditional quantiles or VaR of our model outperform
the GARCH family. Further, the proposed approach does not suffer from the issue
of quantile crossing, nor is it exposed to the ill-posedness of the parametric
probability density function approach.
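The abstract centers on two ideas that can be sketched in a few lines: the pinball loss that underlies quantile regression, and a quantile function that is monotone in the quantile level by construction, which rules out quantile crossing. The sketch below is illustrative only; the functional form and all parameter names are hypothetical, not the authors' model.

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) loss: penalizes under- and over-prediction
    asymmetrically, so minimizing it yields the tau-quantile."""
    err = y - q_pred
    return np.mean(np.maximum(tau * err, (tau - 1.0) * err))

def parametric_quantile(tau, mu, sigma, u, v):
    """A hypothetical monotone parametric quantile function: location mu,
    scale sigma > 0, and tail parameters u, v >= 0 controlling left and
    right tail heaviness. Because the expression is strictly increasing
    in tau, quantile crossing is impossible by construction."""
    z = np.log(tau / (1.0 - tau))  # logistic base quantile, increasing in tau
    tail = np.where(z >= 0, np.exp(v * z) - 1.0, -(np.exp(-u * z) - 1.0))
    return mu + sigma * (z + tail)
```

In a sequential model such as an LSTM, the parameters `mu`, `sigma`, `u`, `v` would be produced at each time step from past returns, giving a full time-varying conditional quantile curve from a handful of outputs.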
Related papers
- Generalized Distribution Prediction for Asset Returns [0.9944647907864256]
We present a novel approach for predicting the distribution of asset returns using a quantile-based method with Long Short-Term Memory (LSTM) networks.
Our model is designed in two stages: the first focuses on predicting the quantiles of normalized asset returns using asset-specific features, while the second stage incorporates market data to adjust these predictions for broader economic conditions.
arXiv Detail & Related papers (2024-10-15T15:31:44Z)
- Consensus-Adaptive RANSAC [104.87576373187426]
We propose a new RANSAC framework that learns to explore the parameter space by considering the residuals seen so far via a novel attention layer.
The attention mechanism operates on a batch of point-to-model residuals, and updates a per-point estimation state to take into account the consensus found through a lightweight one-step transformer.
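For context on what the learned variant replaces, a minimal classic RANSAC loop for 2D line fitting might look as follows (an illustrative sketch, not the paper's attention-based method; all names are hypothetical):

```python
import numpy as np

def ransac_line(points, n_iters=200, thresh=0.1, rng=None):
    """Classic RANSAC for 2D line fitting. The learned variant above
    replaces this blind random sampling with an attention-guided
    exploration of the parameter space using the residuals seen so far."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.hypot(*d)
        if norm < 1e-12:
            continue  # degenerate sample: both points coincide
        # Point-to-line distances via the 2D cross product.
        resid = np.abs((points[:, 0] - p[0]) * d[1]
                       - (points[:, 1] - p[1]) * d[0]) / norm
        inliers = resid < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```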
arXiv Detail & Related papers (2023-07-26T08:25:46Z)
- Adaptive Conditional Quantile Neural Processes [9.066817971329899]
Conditional Quantile Neural Processes (CQNPs) are a new member of the neural processes family.
We introduce an extension of quantile regression where the model learns to focus on estimating informative quantiles.
Experiments with real and synthetic datasets demonstrate substantial improvements in predictive performance.
arXiv Detail & Related papers (2023-05-30T06:19:19Z)
- Bayesian predictive modeling of multi-source multi-way data [0.0]
We consider molecular data from multiple 'omics sources as predictors of early-life iron deficiency (ID) in a rhesus monkey model.
We use a linear model with a low-rank structure on the coefficients to capture multi-way dependence.
We show that our model performs as expected in terms of misclassification rates and correlation of estimated coefficients with true coefficients.
arXiv Detail & Related papers (2022-08-05T21:58:23Z)
- Time varying regression with hidden linear dynamics [74.9914602730208]
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system.
Counterintuitively, we show that when the underlying dynamics are stable the parameters of this model can be estimated from data by combining just two ordinary least squares estimates.
arXiv Detail & Related papers (2021-12-29T23:37:06Z)
- Forecasting High-Dimensional Covariance Matrices of Asset Returns with Hybrid GARCH-LSTMs [0.0]
This paper investigates the ability of hybrid models, mixing GARCH processes and neural networks, to forecast covariance matrices of asset returns.
The proposed model is very promising: it outperforms not only the equally weighted portfolio but also, by a significant margin, its econometric counterpart.
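The GARCH side of such a hybrid reduces to a simple variance recursion. A minimal GARCH(1,1) sketch follows (parameter values are illustrative, not the paper's fitted model):

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """GARCH(1,1) conditional variance recursion:
        sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    This is the econometric building block that hybrid GARCH-LSTM models
    combine with a neural network component."""
    sigma2 = np.empty(len(returns))
    # Initialize at the unconditional variance (requires alpha + beta < 1).
    sigma2[0] = omega / (1.0 - alpha - beta)
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2
```

In the hybrid setting, an LSTM would typically either forecast the GARCH parameters or correct the GARCH variance forecasts, rather than replace the recursion outright.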
arXiv Detail & Related papers (2021-08-25T23:41:43Z)
- Dynamic Gaussian Mixture based Deep Generative Model For Robust Forecasting on Sparse Multivariate Time Series [43.86737761236125]
We propose a novel generative model, which tracks the transition of latent clusters, instead of isolated feature representations.
It is characterized by a newly designed dynamic Gaussian mixture distribution, which captures the dynamics of clustering structures.
A structured inference network is also designed for enabling inductive analysis.
arXiv Detail & Related papers (2021-03-03T04:10:07Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
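Parameterizing a mean and variance per time-stamp implies a Gaussian negative log-likelihood objective. A minimal sketch (an assumption about the training loss, not the paper's exact objective):

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Per-time-stamp Gaussian negative log-likelihood, averaged over
    time steps. Predicting log-variance instead of variance keeps the
    variance strictly positive without constrained optimization."""
    var = np.exp(log_var)
    return 0.5 * np.mean(np.log(2.0 * np.pi) + log_var + (y - mu) ** 2 / var)
```

In a model like the one summarized above, `mu` and `log_var` would be the outputs of the neural networks at each time step, with an additional smoothness penalty on consecutive means.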
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the reasons for its success remain unclear.
We show that heavy tails commonly arise in optimization parameters as a result of multiplicative noise due to variance.
A detailed analysis of key factors, including step size and data, shows similar behavior across state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.