Spatiotemporal Adaptive Neural Network for Long-term Forecasting of
Financial Time Series
- URL: http://arxiv.org/abs/2003.12194v3
- Date: Mon, 16 Nov 2020 15:33:30 GMT
- Title: Spatiotemporal Adaptive Neural Network for Long-term Forecasting of
Financial Time Series
- Authors: Philippe Chatigny, Jean-Marc Patenaude, Shengrui Wang
- Abstract summary: We investigate whether deep neural networks (DNNs) can be used to forecast time series (TS) conjointly.
We make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model.
With ACTM, it is possible to vary the autoregressive order of a TS model over time and model a larger set of probability distributions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimal decision-making in social settings is often based on forecasts from
time series (TS) data. Recently, several approaches using deep neural networks
(DNNs) such as recurrent neural networks (RNNs) have been introduced for TS
forecasting and have shown promising results. However, the applicability of
these approaches is being questioned for TS settings where there is a lack of
quality training data and where the TS to forecast exhibit complex behaviors.
Examples of such settings include financial TS forecasting, where producing
accurate and consistent long-term forecasts is notoriously difficult. In this
work, we investigate whether DNN-based models can be used to forecast these TS
conjointly by learning a joint representation of the series instead of
computing the forecast from the raw time-series representations. To this end,
we make use of the dynamic factor graph (DFG) to build a multivariate
autoregressive model. We investigate a common limitation of RNNs that rely on
the DFG framework and propose a novel variable-length attention-based mechanism
(ACTM) to address it. With ACTM, it is possible to vary the autoregressive
order of a TS model over time and model a larger set of probability
distributions than with previous approaches. Using this mechanism, we propose a
self-supervised DNN architecture for multivariate TS forecasting that learns
and takes advantage of the relationships between them. We test our model on two
datasets covering 19 years of investment fund activities. Our experimental
results show that the proposed approach significantly outperforms typical
DNN-based and statistical models at forecasting the 21-day price trajectory. We
point out how improving forecasting accuracy and knowing which forecaster to
use can improve the excess return of autonomous trading strategies.
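The abstract does not spell out the ACTM mechanism itself, but the underlying idea — attention weights over a window of past lags, so that lags receiving near-zero weight are effectively dropped and the autoregressive order varies over time — can be sketched minimally. All function and parameter names below are invented for illustration; this is not the paper's architecture.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attentive_ar_forecast(history, scores_w, max_lag):
    """Toy one-step forecast: scores over the last `max_lag` observations
    become attention weights, so lags whose weight is pushed toward zero
    contribute almost nothing and the effective autoregressive order can
    differ from step to step. Names here are illustrative only."""
    lags = np.asarray(history[-max_lag:])
    weights = softmax(scores_w * lags)   # attention over recent lags
    return float(weights @ lags)         # weighted lag combination

rng = np.random.default_rng(0)
series = rng.normal(size=60).cumsum()
forecast = attentive_ar_forecast(series, scores_w=np.linspace(0.1, 1.0, 5), max_lag=5)
print(forecast)
```

In the actual paper the scores would be produced by a learned network rather than a fixed vector; the sketch only shows how attention over lags yields a time-varying autoregressive order.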
Related papers
- F-FOMAML: GNN-Enhanced Meta-Learning for Peak Period Demand Forecasting with Proxy Data [65.6499834212641]
We formulate the demand prediction as a meta-learning problem and develop the Feature-based First-Order Model-Agnostic Meta-Learning (F-FOMAML) algorithm.
By considering domain similarities through task-specific metadata, our model improves generalization, with the excess risk decreasing as the number of training tasks increases.
Compared to existing state-of-the-art models, our method demonstrates a notable improvement in demand prediction accuracy, reducing the Mean Absolute Error by 26.24% on an internal vending machine dataset and by 1.04% on the publicly accessible JD.com dataset.
arXiv Detail & Related papers (2024-06-23T21:28:50Z)
- Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- Graph-enabled Reinforcement Learning for Time Series Forecasting with Adaptive Intelligence [11.249626785206003]
We propose a novel approach for predicting time-series data using graph neural networks (GNNs) combined with monitoring via reinforcement learning (RL).
GNNs are able to explicitly incorporate the graph structure of the data into the model, allowing them to capture temporal dependencies in a more natural way.
This approach allows for more accurate predictions in complex temporal structures, such as those found in healthcare, traffic and weather forecasting.
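The GNN side of such an approach can be pictured with a minimal graph-convolution-style layer: each node aggregates its neighbours' features through a normalised adjacency matrix before a shared linear map. This is a generic sketch, not the model from the paper, and the toy graph and names are invented.

```python
import numpy as np

def gnn_layer(adj, features, weight):
    """One graph-convolution-style layer: add self-loops, row-normalise
    the adjacency so each node averages over its neighbourhood, then
    apply a shared linear map and a ReLU nonlinearity."""
    a_hat = adj + np.eye(adj.shape[0])                  # self-loops
    a_norm = a_hat / a_hat.sum(axis=1, keepdims=True)   # row-normalise
    return np.maximum(a_norm @ features @ weight, 0.0)  # aggregate + transform

# Toy graph of 3 sensors, 2 input features each, 2 hidden units.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.array([[1., 0.], [0., 1.], [1., 1.]])
w = np.eye(2)
out = gnn_layer(adj, x, w)
print(out)
```

Stacking such layers is how a GNN incorporates the graph structure directly, which is what lets it capture dependencies between related series.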
arXiv Detail & Related papers (2023-09-18T22:25:12Z)
- DiffSTG: Probabilistic Spatio-Temporal Graph Forecasting with Denoising Diffusion Models [53.67562579184457]
This paper focuses on probabilistic STG forecasting, which is challenging due to the difficulty in modeling uncertainties and complex dependencies.
We present the first attempt to generalize the popular denoising diffusion models to STGs, leading to a novel non-autoregressive framework called DiffSTG.
Our approach combines the intrinsic spatio-temporal learning capabilities of spatio-temporal neural networks (STNNs) with the uncertainty measurements of diffusion models.
arXiv Detail & Related papers (2023-01-31T13:42:36Z)
- Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce Probabilistic AutoRegressive Neural Networks (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z)
- ES-dRNN: A Hybrid Exponential Smoothing and Dilated Recurrent Neural Network Model for Short-Term Load Forecasting [1.4502611532302039]
Short-term load forecasting (STLF) is challenging due to the complex nature of the underlying time series (TS).
This paper proposes a novel hybrid hierarchical deep learning model that deals with multiple seasonality.
It combines exponential smoothing (ES) and a recurrent neural network (RNN).
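The exponential smoothing half of such a hybrid follows the classic recursion s_t = alpha * x_t + (1 - alpha) * s_{t-1}. The snippet below is a minimal sketch of that recursion only, not the ES-dRNN model; in a hybrid, the smoothed level typically normalises or deseasonalises the input before an RNN models the residual dynamics.

```python
def exponential_smoothing(series, alpha):
    """Simple exponential smoothing: each new level is a convex
    combination of the latest observation and the previous level,
    s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    level = series[0]
    smoothed = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        smoothed.append(level)
    return smoothed

print(exponential_smoothing([10.0, 12.0, 11.0, 13.0], alpha=0.5))
# → [10.0, 11.0, 11.0, 12.0]
```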
arXiv Detail & Related papers (2021-12-05T19:38:42Z)
- Neural forecasting at scale [8.245069318446415]
We study the problem of efficiently scaling ensemble-based deep neural networks for time series (TS) forecasting on a large set of time series.
Our model addresses the practical limitations of related models, reducing the training time by half and memory requirement by a factor of 5.
arXiv Detail & Related papers (2021-09-20T17:22:40Z)
- Back2Future: Leveraging Backfill Dynamics for Improving Real-time Predictions in Future [73.03458424369657]
In real-time forecasting in public health, data collection is a non-trivial and demanding task.
The 'backfill' phenomenon and its effect on model performance have barely been studied in the prior literature.
We formulate a novel problem and neural framework Back2Future that aims to refine a given model's predictions in real-time.
arXiv Detail & Related papers (2021-06-08T14:48:20Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Deep Distributional Time Series Models and the Probabilistic Forecasting of Intraday Electricity Prices [0.0]
We propose two approaches to constructing deep time series probabilistic models.
The first is where the output layer of the ESN has disturbances and a shrinkage prior for additional regularization.
The second approach employs the implicit copula of an ESN with Gaussian disturbances, which is a deep copula process on the feature space.
arXiv Detail & Related papers (2020-10-05T08:02:29Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.