Data Scaling Effect of Deep Learning in Financial Time Series Forecasting
- URL: http://arxiv.org/abs/2309.02072v5
- Date: Sat, 1 Jun 2024 03:40:12 GMT
- Title: Data Scaling Effect of Deep Learning in Financial Time Series Forecasting
- Authors: Chen Liu, Minh-Ngoc Tran, Chao Wang, Richard Gerlach, Robert Kohn
- Abstract summary: This study highlights the importance of global training, where the deep learning model is optimized across a wide spectrum of stocks.
We show that a globally trained deep learning model is capable of delivering accurate zero-shot forecasts for any stock.
- Score: 5.299784478982814
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: For years, researchers have investigated applications of deep learning to forecasting financial time series. However, they have continued to rely on the conventional econometric approach to model training, which optimizes deep learning models on individual assets. This study highlights the importance of global training, where the deep learning model is optimized across a wide spectrum of stocks. Focusing on stock volatility forecasting as an exemplar, we show that global training is not only beneficial but also necessary for deep learning-based financial time series forecasting. We further demonstrate that, given a sufficient amount of training data, a globally trained deep learning model is capable of delivering accurate zero-shot forecasts for any stock.
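As an illustration of the global-versus-local contrast drawn in the abstract, the sketch below trains one volatility model across a panel of synthetic return series and queries it zero-shot on a held-out stock. The LSTM architecture, Gaussian likelihood loss, and synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: "global" training pools windows from many stocks into one model,
# versus the conventional per-stock ("local") fit. Data are synthetic; the
# architecture and loss are illustrative, not the paper's exact setup.
import torch
import torch.nn as nn

torch.manual_seed(0)
N_STOCKS, T, WINDOW = 50, 500, 20

# Synthetic daily returns for a panel of stocks (placeholder for real data).
returns = 0.01 * torch.randn(N_STOCKS, T)

def make_windows(r):
    """Turn one return series into (past-window, next-return) pairs."""
    xs = r.unfold(0, WINDOW, 1)[:-1]             # (T - WINDOW, WINDOW)
    ys = r[WINDOW:]                              # next-day return
    return xs.unsqueeze(-1), ys

class VolNet(nn.Module):
    """Maps a window of past returns to a predicted log-variance."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        h, _ = self.rnn(x)
        return self.head(h[:, -1]).squeeze(-1)   # log sigma^2

def gaussian_nll(log_var, y):
    # Negative log-likelihood of a zero-mean Gaussian for next-day returns.
    return 0.5 * (log_var + y.pow(2) / log_var.exp()).mean()

def train(model, series_list, epochs=3):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for r in series_list:
            x, y = make_windows(r)
            opt.zero_grad()
            gaussian_nll(model(x), y).backward()
            opt.step()
    return model

# Local (conventional) training: one model per stock.
local_models = [train(VolNet(), [returns[i]]) for i in range(3)]

# Global training: a single model optimized across the whole cross-section;
# it can then produce zero-shot forecasts for a stock never seen in training.
global_model = train(VolNet(), [returns[i] for i in range(N_STOCKS - 1)])
x_new, _ = make_windows(returns[-1])             # held-out stock
zero_shot_vol = global_model(x_new[-1:]).exp().sqrt()
print("zero-shot vol forecast:", float(zero_shot_vol))
```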
Related papers
- Predicting Practically? Domain Generalization for Predictive Analytics in Real-world Environments [18.086130222010496]
We propose a novel domain generalization method tailored to handle complex distribution shifts.
Our method builds upon the Distributionally Robust Optimization framework, optimizing model performance over a set of hypothetical worst-case distributions.
We discuss the broader implications of our method for advancing Information Systems (IS) design research.
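The worst-case-distribution idea behind the DRO framework mentioned above can be illustrated with a small group-DRO-style objective; the synthetic domains, linear model, and softmax re-weighting below are illustrative assumptions rather than the paper's exact formulation.

```python
# Hedged sketch of a distributionally robust objective: instead of minimizing the
# average loss, minimize a soft version of the worst-case loss over groups that
# stand in for hypothetical shifted distributions. Details are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# Three synthetic "domains" with slightly different input distributions.
domains = [(torch.randn(64, 10) + shift, torch.randn(64, 1)) for shift in (0.0, 0.5, 1.0)]

for step in range(100):
    group_losses = torch.stack([loss_fn(model(x), y) for x, y in domains])
    # Softmax re-weighting upweights the currently worst domain (temperature is a free choice).
    weights = torch.softmax(5.0 * group_losses.detach(), dim=0)
    robust_loss = (weights * group_losses).sum()
    opt.zero_grad()
    robust_loss.backward()
    opt.step()
```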
arXiv Detail & Related papers (2025-03-05T11:21:37Z) - Meta-Statistical Learning: Supervised Learning of Statistical Inference [59.463430294611626]
This work demonstrates that the tools and principles driving the success of large language models (LLMs) can be repurposed to tackle distribution-level tasks.
We propose meta-statistical learning, a framework inspired by multi-instance learning that reformulates statistical inference tasks as supervised learning problems.
arXiv Detail & Related papers (2025-02-17T18:04:39Z) - Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z) - GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation [90.53485251837235]
Time series foundation models excel in zero-shot forecasting, handling diverse tasks without explicit training.
GIFT-Eval is a pioneering benchmark aimed at promoting evaluation across diverse datasets.
GIFT-Eval encompasses 23 datasets over 144,000 time series and 177 million data points.
arXiv Detail & Related papers (2024-10-14T11:29:38Z) - An Evaluation of Deep Learning Models for Stock Market Trend Prediction [0.3277163122167433]
This study investigates the efficacy of advanced deep learning models for short-term trend forecasting using daily and hourly closing prices from the S&P 500 index and the Brazilian ETF EWZ.
We introduce the Extended Long Short-Term Memory for Time Series (xLSTM-TS) model, an xLSTM adaptation optimised for time series prediction.
Among the models tested, xLSTM-TS consistently outperformed others. For example, it achieved a test accuracy of 72.82% and an F1 score of 73.16% on the EWZ daily dataset.
arXiv Detail & Related papers (2024-08-22T13:58:55Z) - F-FOMAML: GNN-Enhanced Meta-Learning for Peak Period Demand Forecasting with Proxy Data [65.6499834212641]
We formulate the demand prediction as a meta-learning problem and develop the Feature-based First-Order Model-Agnostic Meta-Learning (F-FOMAML) algorithm.
By considering domain similarities through task-specific metadata, our model improves generalization, with the excess risk decreasing as the number of training tasks increases.
Compared to existing state-of-the-art models, our method demonstrates a notable improvement in demand prediction accuracy, reducing the Mean Absolute Error by 26.24% on an internal vending machine dataset and by 1.04% on the publicly accessible JD.com dataset.
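A rough sketch of the first-order meta-learning loop underlying F-FOMAML is given below; the synthetic tasks, linear model, and step sizes are placeholders, and the GNN-based task-metadata component is omitted.

```python
# Hedged sketch of first-order MAML: adapt a copy of the model on each task's
# support set, evaluate it on the query set, and apply the resulting gradients
# directly to the shared initialization (no second-order terms).
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
meta_model = nn.Linear(8, 1)
meta_opt = torch.optim.Adam(meta_model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
TASKS_PER_BATCH = 4

def sample_task():
    """Synthetic regression task standing in for one demand-forecasting series."""
    w = torch.randn(8, 1)
    xs, xq = torch.randn(16, 8), torch.randn(16, 8)
    return (xs, xs @ w), (xq, xq @ w)

for meta_step in range(200):
    meta_opt.zero_grad()
    for _ in range(TASKS_PER_BATCH):
        (xs, ys), (xq, yq) = sample_task()
        learner = copy.deepcopy(meta_model)        # task-specific copy
        inner_opt = torch.optim.SGD(learner.parameters(), lr=0.05)
        for _ in range(3):                         # inner adaptation steps
            inner_opt.zero_grad()
            loss_fn(learner(xs), ys).backward()
            inner_opt.step()
        inner_opt.zero_grad()
        loss_fn(learner(xq), yq).backward()        # query-set gradients on the adapted copy
        # First-order update: average the adapted copies' gradients into the meta-model.
        for p_meta, p_task in zip(meta_model.parameters(), learner.parameters()):
            g = p_task.grad / TASKS_PER_BATCH
            p_meta.grad = g if p_meta.grad is None else p_meta.grad + g
    meta_opt.step()
```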
arXiv Detail & Related papers (2024-06-23T21:28:50Z) - Enhancing Mean-Reverting Time Series Prediction with Gaussian Processes:
Functional and Augmented Data Structures in Financial Forecasting [0.0]
We explore the application of Gaussian Processes (GPs) for predicting mean-reverting time series with an underlying structure.
GPs offer the potential to forecast not just the average prediction but the entire probability distribution over a future trajectory.
This is particularly beneficial in financial contexts, where accurate predictions alone may not suffice if incorrect volatility assessments lead to capital losses.
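The point about forecasting an entire predictive distribution can be made concrete with scikit-learn's GP regressor on a synthetic mean-reverting series; the kernel and data below are illustrative, not the paper's functional or augmented data structures.

```python
# Hedged sketch: a GP regression returns a full predictive distribution (mean and
# standard deviation), not just a point forecast. Series and kernel are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic mean-reverting (Ornstein-Uhlenbeck style) series.
n, theta, sigma = 200, 0.1, 0.3
y = np.zeros(n)
for t in range(1, n):
    y[t] = y[t - 1] - theta * y[t - 1] + sigma * rng.normal()

t_train = np.arange(n).reshape(-1, 1)
kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t_train, y)

# Predictive mean and uncertainty over a future horizon.
t_future = np.arange(n, n + 20).reshape(-1, 1)
mean, std = gp.predict(t_future, return_std=True)
print(mean[:3], std[:3])
```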
arXiv Detail & Related papers (2024-02-23T06:09:45Z) - Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling the data issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z) - From Reactive to Proactive Volatility Modeling with Hemisphere Neural Networks [0.0]
We reinvigorate maximum likelihood estimation (MLE) for macroeconomic density forecasting through a novel neural network architecture with dedicated mean and variance hemispheres.
Our Hemisphere Neural Network (HNN) provides proactive volatility forecasts based on leading indicators when it can, and reactive volatility based on the magnitude of previous prediction errors when it must.
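The mean/variance "hemisphere" idea reduces, in its simplest form, to a network with a shared trunk and two heads trained by Gaussian maximum likelihood, as in the sketch below; the architecture and inputs are illustrative placeholders rather than the HNN itself.

```python
# Hedged sketch: one shared trunk with separate mean and log-variance heads,
# trained by maximizing the Gaussian likelihood of the target. The exact HNN
# architecture and its leading-indicator inputs are not reproduced here.
import torch
import torch.nn as nn

torch.manual_seed(0)

class TwoHeadedNet(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, 1)      # conditional mean
        self.logvar_head = nn.Linear(hidden, 1)    # conditional log-variance
    def forward(self, x):
        h = self.trunk(x)
        return self.mean_head(h).squeeze(-1), self.logvar_head(h).squeeze(-1)

def gaussian_nll(mu, log_var, y):
    return 0.5 * (log_var + (y - mu) ** 2 / log_var.exp()).mean()

x, y = torch.randn(256, 12), torch.randn(256)      # placeholder indicators/target
model = TwoHeadedNet(n_features=12)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    mu, log_var = model(x)
    opt.zero_grad()
    gaussian_nll(mu, log_var, y).backward()
    opt.step()
# The log-variance head delivers the (proactive) volatility forecast.
vol_forecast = model(x)[1].exp().sqrt()
```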
arXiv Detail & Related papers (2023-11-27T21:37:50Z) - Pushing the Limits of Pre-training for Time Series Forecasting in the
CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show that our pre-trained model is a strong zero-shot baseline and benefits from further scaling in both model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z) - Towards a Prediction of Machine Learning Training Time to Support
Continuous Learning Systems Development [5.207307163958806]
We present an empirical study of the Full Parameter Time Complexity (FPTC) approach by Zheng et al.
We study the formulations proposed for the Logistic Regression and Random Forest classifiers.
From the conducted study, we observe that the prediction of training time is strongly dependent on the context.
arXiv Detail & Related papers (2023-09-20T11:35:03Z) - Deep learning models for price forecasting of financial time series: A
review of recent advancements: 2020-2022 [6.05458608266581]
Deep learning models are replacing traditional statistical and machine learning models for price forecasting tasks.
This review delves deeply into deep learning-based forecasting models, presenting information on model architectures, practical applications, and their respective advantages and disadvantages.
The present contribution also includes potential directions for future research, such as examining the effectiveness of deep learning models with complex structures for price forecasting.
arXiv Detail & Related papers (2023-04-21T03:46:09Z) - Integrating Local Real Data with Global Gradient Prototypes for
Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
Data samples usually follow a long-tailed distribution in the real world, and FL on decentralized, long-tailed data yields a poorly behaved global model.
In this work, we integrate the local real data with the global gradient prototypes to form the local balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z) - DeepVol: Volatility Forecasting from High-Frequency Data with Dilated Causal Convolutions [53.37679435230207]
We propose DeepVol, a model based on Dilated Causal Convolutions that uses high-frequency data to forecast day-ahead volatility.
Our empirical results suggest that the proposed deep learning-based approach effectively learns global features from high-frequency data.
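A minimal sketch of the dilated causal convolution stack that DeepVol builds on is shown below, applied to a batch of intraday returns; the layer sizes, padding scheme, and read-out of the last time step are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch: stacked 1-D convolutions with exponentially growing dilation and
# left-only padding, so each output depends solely on past observations. This is
# the generic building block, not DeepVol's exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)
    def forward(self, x):                            # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.left_pad, 0)))

class DilatedCausalNet(nn.Module):
    def __init__(self, channels=16, layers=4):
        super().__init__()
        blocks, in_ch = [], 1
        for i in range(layers):
            blocks += [CausalConv1d(in_ch, channels, 3, dilation=2 ** i), nn.ReLU()]
            in_ch = channels
        self.net = nn.Sequential(*blocks)
        self.head = nn.Linear(channels, 1)
    def forward(self, x):
        h = self.net(x)                              # (batch, channels, time)
        return self.head(h[:, :, -1])                # day-ahead volatility from the last step

# Example: a batch of intraday return sequences, e.g. 390 one-minute returns.
hf_returns = 0.001 * torch.randn(8, 1, 390)
print(DilatedCausalNet()(hf_returns).shape)          # torch.Size([8, 1])
```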
arXiv Detail & Related papers (2022-09-23T16:13:47Z) - Real-time Forecasting of Time Series in Financial Markets Using
Sequentially Trained Many-to-one LSTMs [0.304585143845864]
We train two LSTMs with a known length, say $T$ time steps, of previous data and predict only one time step ahead.
While one LSTM is employed to find the best number of epochs, the second LSTM is trained only for the best number of epochs to make predictions.
We then include the current observation in the training set for the next prediction and retrain the same LSTM.
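The rolling procedure described in this entry can be sketched as follows; the synthetic series, network size, and fixed epoch count are illustrative, and the second, epoch-selection LSTM is reduced to a constant here.

```python
# Hedged sketch: a many-to-one LSTM is fit on the last T observations, predicts
# one step ahead, and is re-trained once the new observation arrives.
import torch
import torch.nn as nn

torch.manual_seed(0)
T, EPOCHS = 30, 20
series = torch.sin(torch.linspace(0, 20, 300)) + 0.05 * torch.randn(300)

class ManyToOneLSTM(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):                    # x: (batch, T, 1)
        h, _ = self.rnn(x)
        return self.head(h[:, -1]).squeeze(-1)

model = ManyToOneLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

predictions = []
for t in range(T, 260):
    window = series[t - T:t].view(1, T, 1)   # last T observations
    target = series[t].view(1)               # newly arrived observation
    for _ in range(EPOCHS):                  # re-train on the latest window
        opt.zero_grad()
        loss_fn(model(window), target).backward()
        opt.step()
    # One-step-ahead forecast from the window ending at t.
    next_window = series[t - T + 1:t + 1].view(1, T, 1)
    predictions.append(float(model(next_window)))
```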
arXiv Detail & Related papers (2022-05-10T05:18:45Z) - DRFLM: Distributionally Robust Federated Learning with Inter-client
Noise via Local Mixup [58.894901088797376]
Federated learning has emerged as a promising approach for training a global model using data from multiple organizations without leaking their raw data.
We propose a general framework to solve the above two challenges simultaneously.
We provide comprehensive theoretical analysis including robustness analysis, convergence analysis, and generalization ability.
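Local mixup, named in this entry's title, can be illustrated in a few lines; the data and mixing coefficient are placeholders, and the federated and distributionally robust machinery around it is omitted.

```python
# Hedged sketch of mixup applied locally on one client's batch: convex
# combinations of example pairs (and their labels) act as a noise/regularizer.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 10)).astype(np.float32)    # one client's features
y = rng.integers(0, 2, size=(32, 1)).astype(np.float32)

lam = rng.beta(0.4, 0.4)                             # mixing coefficient
perm = rng.permutation(len(x))
x_mixed = lam * x + (1 - lam) * x[perm]
y_mixed = lam * y + (1 - lam) * y[perm]
```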
arXiv Detail & Related papers (2022-04-16T08:08:29Z) - Back2Future: Leveraging Backfill Dynamics for Improving Real-time
Predictions in Future [73.03458424369657]
In real-time forecasting in public health, data collection is a non-trivial and demanding task.
The 'backfill' phenomenon and its effect on model performance have barely been studied in the prior literature.
We formulate a novel problem and neural framework Back2Future that aims to refine a given model's predictions in real-time.
arXiv Detail & Related papers (2021-06-08T14:48:20Z) - A Deep Learning Framework for Predicting Digital Asset Price Movement
from Trade-by-trade Data [20.392440676633573]
This paper presents a framework that predicts price movement of cryptocurrencies from trade-by-trade data.
The model is trained to achieve high performance on nearly a year of trade-by-trade data.
In a realistic trading simulation setting, the prediction made by the model could be easily monetized.
arXiv Detail & Related papers (2020-10-11T10:42:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.