Copula Variational LSTM for High-dimensional Cross-market Multivariate Dependence Modeling
- URL: http://arxiv.org/abs/2305.08778v1
- Date: Tue, 9 May 2023 08:19:08 GMT
- Title: Copula Variational LSTM for High-dimensional Cross-market Multivariate Dependence Modeling
- Authors: Jia Xu and Longbing Cao
- Abstract summary: We make the first attempt to integrate variational sequential neural learning with copula-based dependence modeling.
Our variational neural network WPVC-VLSTM models variational sequential dependence degrees and structures across time series.
It outperforms benchmarks including linear models, volatility models, deep neural networks, and variational recurrent networks in cross-market portfolio forecasting.
- Score: 46.75628526959982
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address an important yet challenging problem: modeling high-dimensional
dependencies across multivariates such as financial indicators in heterogeneous
markets. In reality, a market couples with and influences others over time, and the
financial variables of a market are also coupled. We make the first attempt to
integrate variational sequential neural learning with copula-based dependence
modeling to characterize both temporal observable and latent variable-based
dependence degrees and structures across non-normal multivariates. Our
variational neural network WPVC-VLSTM models variational sequential dependence
degrees and structures across multivariate time series via variational long
short-term memory networks and a regular vine copula. The regular vine copula
models non-normal and long-range distributional couplings across multiple
dynamic variables. WPVC-VLSTM is verified in terms of both technical
significance and portfolio forecasting performance. It outperforms benchmarks
including linear models, stochastic volatility models, deep neural networks,
and variational recurrent networks in cross-market portfolio forecasting.
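The sketch below is a minimal, illustrative rendering of this idea rather than the authors' WPVC-VLSTM: a variational LSTM encodes each market's indicators into Gaussian latent states, and a single bivariate Gaussian pair copula stands in for the regular vine that couples the latents across markets. Every module name, dimension, and the fixed correlation are assumptions made for the example.

```python
# Minimal sketch (not the authors' code): a variational LSTM per market
# emits Gaussian latent states, and a bivariate Gaussian pair copula
# (a stand-in for a full regular vine) couples the latents across markets.
import torch
import torch.nn as nn

class VariationalLSTM(nn.Module):
    """LSTM encoder emitting a diagonal-Gaussian latent z_t at each step."""
    def __init__(self, n_features: int, hidden: int = 32, latent: int = 8):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)

    def forward(self, x):                        # x: (batch, time, features)
        h, _ = self.lstm(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return z, mu, logvar

def gaussian_copula_logdensity(u, v, rho):
    """Log density of a bivariate Gaussian copula at pseudo-uniforms (u, v)."""
    normal = torch.distributions.Normal(0.0, 1.0)
    a = normal.icdf(u.clamp(1e-4, 1.0 - 1e-4))
    b = normal.icdf(v.clamp(1e-4, 1.0 - 1e-4))
    r2 = 1.0 - rho ** 2
    return (-0.5 * torch.log(r2)
            - (rho ** 2 * (a ** 2 + b ** 2) - 2 * rho * a * b) / (2 * r2))

# Two markets, each with 5 indicators over 60 steps (synthetic stand-in data).
xs = [torch.randn(16, 60, 5) for _ in range(2)]
encoders = [VariationalLSTM(5) for _ in range(2)]
z1, mu1, logvar1 = encoders[0](xs[0])   # mu/logvar would feed the KL term
z2, mu2, logvar2 = encoders[1](xs[1])

# Push each latent through the standard-normal CDF to get pseudo-uniforms,
# then score the cross-market coupling (rho fixed here; learned in practice).
normal = torch.distributions.Normal(0.0, 1.0)
u, v = normal.cdf(z1), normal.cdf(z2)
coupling_loglik = gaussian_copula_logdensity(u, v, torch.tensor(0.5)).mean()
print(float(coupling_loglik))
```

In a full regular vine, pair copulas like this one are arranged over a cascade of trees covering all variable pairs, with their parameters learned jointly with the sequential model.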
Related papers
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
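A toy rendering of attention over flattened patch tokens, as one reading of this summary (not UniTST's actual architecture; every size below is an arbitrary assumption):

```python
# Illustrative sketch: embed per-series patches, flatten them into one token
# sequence, and let a single attention layer mix inter- and intra-series info.
import torch
import torch.nn as nn

batch, n_series, n_patches, patch_len, d_model = 8, 7, 12, 16, 64
x = torch.randn(batch, n_series, n_patches, patch_len)         # patched series
tokens = nn.Linear(patch_len, d_model)(x)                      # embed patches
tokens = tokens.reshape(batch, n_series * n_patches, d_model)  # flatten all
attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
out, _ = attn(tokens, tokens, tokens)   # unified attention over all tokens
print(out.shape)                        # torch.Size([8, 84, 64])
```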
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series [57.4208255711412]
Building on copula theory, we propose a simplified objective for the recently introduced transformer-based attentional copulas (TACTiS).
We show that the resulting model has significantly better training dynamics and achieves state-of-the-art performance across diverse real-world forecasting tasks.
arXiv Detail & Related papers (2023-10-02T16:45:19Z)
- Learning multi-modal generative models with permutation-invariant encoders and tighter variational objectives [5.549794481031468]
Devising deep latent variable models for multi-modal data has been a long-standing theme in machine learning research.
In this work, we consider a variational objective that can tightly approximate the data log-likelihood.
We develop more flexible aggregation schemes that avoid the inductive biases in PoE or MoE approaches.
arXiv Detail & Related papers (2023-09-01T10:32:21Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Variational Heteroscedastic Volatility Model [0.0]
We propose an end-to-end neural network architecture capable of modelling heteroscedastic behaviour in financial time series.
VHVM consists of a variational autoencoder to capture relationships between assets, and a recurrent neural network to model the time-evolution of these dependencies.
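A rough illustration of the VAE-plus-RNN composition this summary describes (an assumed structure, not the VHVM implementation; all names and sizes are invented for the sketch):

```python
# Illustrative sketch: a GRU tracks time, per-step Gaussian latents are
# sampled by reparameterization, and a linear head maps each latent to
# per-asset log-volatilities (the heteroscedastic part).
import torch
import torch.nn as nn

class TinyVHVM(nn.Module):
    def __init__(self, n_assets: int, hidden: int = 32, latent: int = 8):
        super().__init__()
        self.rnn = nn.GRU(n_assets, hidden, batch_first=True)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        self.to_logvol = nn.Linear(latent, n_assets)

    def forward(self, returns):                  # (batch, time, assets)
        h, _ = self.rnn(returns)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return self.to_logvol(z), mu, logvar     # time-varying volatilities

log_vol, _, _ = TinyVHVM(n_assets=5)(torch.randn(4, 100, 5))
print(log_vol.shape)                             # torch.Size([4, 100, 5])
```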
arXiv Detail & Related papers (2022-04-11T15:29:46Z)
- Trading with the Momentum Transformer: An Intelligent and Interpretable Architecture [2.580765958706854]
We introduce the Momentum Transformer, an attention-based architecture which outperforms the benchmarks.
We observe remarkable structure in the attention patterns, with significant peaks of importance at momentum turning points.
Through the addition of an interpretable variable selection network, we observe how CPD helps our model to move away from trading predominantly on daily returns data.
arXiv Detail & Related papers (2021-12-16T00:04:12Z)
- CLVSA: A Convolutional LSTM Based Variational Sequence-to-Sequence Model with Attention for Predicting Trends of Financial Markets [12.020797636494267]
We propose CLVSA, a hybrid model that variationally captures underlying features in raw financial trading data.
Our model outperforms basic models, such as convolutional neural network, vanilla LSTM network, and sequence-to-sequence model with attention.
Our experimental results show that, by introducing an approximate posterior, CLVSA takes advantage of an extra regularizer based on the Kullback-Leibler divergence to prevent itself from overfitting traps.
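The closed-form version of that regularizer for a diagonal-Gaussian approximate posterior against a standard-normal prior is short enough to state directly (a generic ELBO ingredient, not CLVSA's code; the tensor shapes are illustrative):

```python
# Closed-form KL between q(z|x) = N(mu, diag(exp(logvar))) and p(z) = N(0, I).
import torch

def kl_to_standard_normal(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL(q || p) summed over latent dims, averaged over the batch."""
    return 0.5 * (logvar.exp() + mu ** 2 - 1.0 - logvar).sum(dim=-1).mean()

# Toy posterior statistics: batch of 16 sequences, 8 latent dimensions.
mu, logvar = 0.1 * torch.randn(16, 8), torch.zeros(16, 8)
kl = kl_to_standard_normal(mu, logvar)
# Training minimizes reconstruction_loss + beta * kl (beta weights the term).
print(float(kl))
```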
arXiv Detail & Related papers (2021-04-08T20:31:04Z)
- Variational Dynamic Mixtures [18.730501689781214]
We develop variational dynamic mixtures (VDM) to infer sequential latent variables.
In an empirical study, we show that VDM outperforms competing approaches on highly multi-modal datasets.
arXiv Detail & Related papers (2020-10-20T16:10:07Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.