Financial Time Series Forecasting using CNN and Transformer
- URL: http://arxiv.org/abs/2304.04912v1
- Date: Tue, 11 Apr 2023 00:56:57 GMT
- Title: Financial Time Series Forecasting using CNN and Transformer
- Authors: Zhen Zeng, Rachneet Kaur, Suchetha Siddagangappa, Saba Rahimi, Tucker
Balch, Manuela Veloso
- Abstract summary: We propose to harness the power of CNNs and Transformers to model both short-term and long-term dependencies within a time series.
In our experiments, we demonstrated the success of the proposed method in comparison to commonly adopted statistical and deep learning methods.
- Score: 16.57996422431636
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is important across various domains for
decision-making. In particular, financial time series such as stock prices can
be hard to predict because it is difficult to model the short-term and
long-term temporal dependencies between data points. Convolutional Neural Networks (CNNs)
are good at capturing local patterns for modeling short-term dependencies.
However, CNNs cannot learn long-term dependencies due to the limited receptive
field. Transformers on the other hand are capable of learning global context
and long-term dependencies. In this paper, we propose to harness the power of
CNNs and Transformers to model both short-term and long-term dependencies
within a time series, and to forecast whether the price will go up, down, or
remain the same (flat) in the future. In our experiments, we demonstrate the
success of
the proposed method in comparison to commonly adopted statistical and deep
learning methods on forecasting intraday stock price change of S&P 500
constituents.
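The core idea of the abstract — a convolutional stage for short-term local patterns, a self-attention stage for long-range context, and a three-class (down/flat/up) head — can be illustrated with a minimal NumPy sketch. This is not the paper's architecture: the layer sizes, the flat-return threshold `eps`, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid 1D convolution: x (T, d_in), w (k, d_in, d_out) -> (T-k+1, d_out)."""
    k = w.shape[0]
    T = x.shape[0] - k + 1
    return np.stack([np.einsum('kd,kde->e', x[t:t + k], w) for t in range(T)])

def self_attention(x):
    """Single-head scaled dot-product attention over the whole sequence."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)               # (T, T) global pairwise scores
    scores -= scores.max(axis=-1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ x

def forecast_logits(prices, w_conv, w_out):
    """CNN for local patterns, then attention for long-range context,
    then a 3-way head over {down, flat, up} from the last time step."""
    x = prices[:, None]                          # (T, 1) univariate series
    h = np.maximum(conv1d(x, w_conv), 0)         # local features + ReLU
    h = self_attention(h)                        # mix in global context
    return h[-1] @ w_out                         # logits for the 3 classes

def label(ret, eps=1e-3):
    """Three-class label from a future return: 0=down, 1=flat, 2=up."""
    return 0 if ret < -eps else (2 if ret > eps else 1)

T, k, d = 32, 5, 8
w_conv = rng.normal(0, 0.1, (k, 1, d))
w_out = rng.normal(0, 0.1, (d, 3))
prices = np.cumsum(rng.normal(0, 1, T))          # toy random-walk "price" series
logits = forecast_logits(prices, w_conv, w_out)
print(logits.shape)                              # (3,)
```

In the actual paper the two stages would be trained jointly with a cross-entropy loss over the three movement classes; here the weights are random, so the sketch only shows the data flow.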
Related papers
- TimeBridge: Non-Stationarity Matters for Long-term Time Series Forecasting [49.6208017412376]
TimeBridge is a novel framework designed to bridge the gap between non-stationarity and dependency modeling.
TimeBridge consistently achieves state-of-the-art performance in both short-term and long-term forecasting.
arXiv Detail & Related papers (2024-10-06T10:41:03Z)
- FAITH: Frequency-domain Attention In Two Horizons for Time Series Forecasting [13.253624747448935]
Time Series Forecasting plays a crucial role in various fields such as industrial equipment maintenance, meteorology, energy consumption, traffic flow and financial investment.
Current deep learning-based predictive models often exhibit a significant deviation between their forecasting outcomes and the ground truth.
We propose a novel model Frequency-domain Attention In Two Horizons, which decomposes time series into trend and seasonal components.
arXiv Detail & Related papers (2024-05-22T02:37:02Z)
- Stock Trend Prediction: A Semantic Segmentation Approach [3.718476964451589]
We present a novel approach to predict long-term daily stock price change trends with fully 2D-convolutional encoder-decoders.
Our hierarchical structure of CNNs makes it capable of capturing both long and short-term temporal relationships effectively.
arXiv Detail & Related papers (2023-03-09T01:29:09Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock prices involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA)
Our proposed model exhibited reduced computational complexity and performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- Long Short-Term Memory Neural Network for Financial Time Series [0.0]
We present an ensemble of independent and parallel long short-term memory neural networks for the prediction of stock price movement.
With a straightforward trading strategy, comparisons with a randomly chosen portfolio and a portfolio containing all the stocks in the index show that the portfolio resulting from the LSTM ensemble provides better average daily returns and higher cumulative returns over time.
arXiv Detail & Related papers (2022-01-20T15:17:26Z)
- Bilinear Input Normalization for Neural Networks in Financial Forecasting [101.89872650510074]
We propose a novel data-driven normalization method for deep neural networks that handle high-frequency financial time-series.
The proposed normalization scheme takes into account the bimodal characteristic of financial time-series.
Our experiments, conducted with state-of-the-arts neural networks and high-frequency data, show significant improvements over other normalization techniques.
arXiv Detail & Related papers (2021-09-01T07:52:03Z)
- Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
arXiv Detail & Related papers (2021-07-05T10:15:23Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Spatiotemporal Adaptive Neural Network for Long-term Forecasting of Financial Time Series [0.2793095554369281]
We investigate whether deep neural networks (DNNs) can be used to forecast multiple time series (TS) conjointly.
We make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model.
With ACTM, it is possible to vary the autoregressive order of a TS model over time and model a larger set of probability distributions.
arXiv Detail & Related papers (2020-03-27T00:53:11Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.