Enhancing Mean-Reverting Time Series Prediction with Gaussian Processes:
Functional and Augmented Data Structures in Financial Forecasting
- URL: http://arxiv.org/abs/2403.00796v1
- Date: Fri, 23 Feb 2024 06:09:45 GMT
- Title: Enhancing Mean-Reverting Time Series Prediction with Gaussian Processes:
Functional and Augmented Data Structures in Financial Forecasting
- Authors: Narayan Tondapu
- Abstract summary: We explore the application of Gaussian Processes (GPs) for predicting mean-reverting time series with an underlying structure.
GPs offer the potential to forecast not just the average prediction but the entire probability distribution over a future trajectory.
This is particularly beneficial in financial contexts, where accurate predictions alone may not suffice if incorrect volatility assessments lead to capital losses.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In this paper, we explore the application of Gaussian Processes (GPs) for
predicting mean-reverting time series with an underlying structure, using
relatively unexplored functional and augmented data structures. While many
conventional forecasting methods concentrate on the short-term dynamics of time
series data, GPs offer the potential to forecast not just the average
prediction but the entire probability distribution over a future trajectory.
This is particularly beneficial in financial contexts, where accurate
predictions alone may not suffice if incorrect volatility assessments lead to
capital losses. Moreover, in trade selection, GPs allow for the forecasting of
multiple Sharpe ratios adjusted for transaction costs, aiding in
decision-making. The functional data representation utilized in this study
enables longer-term predictions by leveraging information from previous years,
even as the forecast moves away from the current year's training data.
Additionally, the augmented representation enriches the training set by
incorporating multiple targets for future points in time, facilitating
long-term predictions. Our implementation closely aligns with the methodology
outlined in prior work, which assessed effectiveness on commodity futures. However, our
testing methodology differs. Instead of real data, we employ simulated data
with similar characteristics. We construct a testing environment to evaluate
both data representations and models under conditions of increasing noise, fat
tails, and inappropriate kernels, conditions commonly encountered in practice.
By simulating data, we can compare our forecast distribution over time against
a full simulation of the actual distribution of our test set, thereby reducing
the inherent uncertainty in testing time series models on real data. We enable
feature prediction through augmentation and employ sub-sampling to ensure the
feasibility of GPs.
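To make the setup described in the abstract concrete, the sketch below simulates a mean-reverting Ornstein-Uhlenbeck path and fits a Gaussian Process so that the forecast is a full predictive distribution (mean and standard deviation) rather than a point estimate. This is a minimal illustration, not the authors' implementation: the scikit-learn API, the kernel choice, and all parameter values are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

rng = np.random.default_rng(0)

# Simulate a mean-reverting Ornstein-Uhlenbeck path:
#   dx_t = theta * (mu - x_t) dt + sigma dW_t
theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1.0 / 252, 2 * 252
x = np.zeros(n)
for t in range(1, n):
    x[t] = x[t - 1] + theta * (mu - x[t - 1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# Train on the first "year", forecast into the second.
t_grid = np.arange(n, dtype=float).reshape(-1, 1)
train, test = slice(0, 252), slice(252, n)

# Kernel choice is an assumption: a smooth component plus observation noise.
kernel = ConstantKernel(1.0) * RBF(length_scale=30.0) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t_grid[train], x[train])

# A GP yields the full predictive distribution, not only the mean forecast.
mean, std = gp.predict(t_grid[test], return_std=True)
print(f"first out-of-sample forecast: {mean[0]:.4f} +/- {std[0]:.4f}")
```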
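The "augmented representation" mentioned in the abstract can be read as enriching the training set so that each observed time contributes several future targets, with the forecast horizon included as an input feature. The construction below is a hypothetical sketch of that idea; the horizons and feature layout are illustrative and not taken from the paper.

```python
import numpy as np

def augment(series: np.ndarray, horizons=(1, 5, 21)):
    """Turn a 1-D series into (time index, horizon) -> future value pairs.

    Each time point contributes one training example per horizon, so the
    training set carries multiple targets for future points in time.
    Hypothetical construction; the horizons here are arbitrary examples.
    """
    X, y = [], []
    for t in range(len(series)):
        for h in horizons:
            if t + h < len(series):
                X.append([t, h])          # features: current time and look-ahead
                y.append(series[t + h])   # target: value h steps ahead
    return np.asarray(X, dtype=float), np.asarray(y, dtype=float)

# Example: reuse the simulated path `x` from the previous sketch.
# X_aug, y_aug = augment(x[:252])
# A GP fit on (t, h) can then be queried at any horizon beyond the last observed time.
```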
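For the trade-selection point in the abstract, posterior samples from a fitted GP can be turned into a distribution of transaction-cost-adjusted Sharpe ratios rather than a single number. The snippet below is a rough, assumed recipe using scikit-learn's `sample_y`; the cost model and annualization factor are placeholders, not the paper's specification.

```python
# Continues from the first sketch: `gp`, `t_grid`, and `test` are assumed in scope.
import numpy as np

n_paths = 200
paths = gp.sample_y(t_grid[test], n_samples=n_paths, random_state=1)  # shape (T, n_paths)

cost_per_trade = 1e-4          # placeholder round-trip transaction cost
rets = np.diff(paths, axis=0)  # per-step P&L of holding the simulated asset
net = rets - cost_per_trade    # naive cost adjustment, one trade per step

# Distribution of annualized Sharpe ratios across sampled trajectories.
sharpe = np.sqrt(252) * net.mean(axis=0) / net.std(axis=0)
print("median cost-adjusted Sharpe:", np.median(sharpe),
      "5th-95th percentile:", np.percentile(sharpe, [5, 95]))
```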
Related papers
- Future-Guided Learning: A Predictive Approach To Enhance Time-Series Forecasting [4.866362841501992]
We introduce Future-Guided Learning, an approach that enhances time-series event forecasting.
Our approach involves two models: a detection model that analyzes future data to identify critical events and a forecasting model that predicts these events based on present data.
When discrepancies arise between the forecasting and detection models, the forecasting model undergoes more substantial updates.
arXiv Detail & Related papers (2024-10-19T21:22:55Z)
- F-FOMAML: GNN-Enhanced Meta-Learning for Peak Period Demand Forecasting with Proxy Data [65.6499834212641]
We formulate the demand prediction as a meta-learning problem and develop the Feature-based First-Order Model-Agnostic Meta-Learning (F-FOMAML) algorithm.
By considering domain similarities through task-specific metadata, our model improves generalization, with the excess risk decreasing as the number of training tasks increases.
Compared to existing state-of-the-art models, our method demonstrates a notable improvement in demand prediction accuracy, reducing the Mean Absolute Error by 26.24% on an internal vending machine dataset and by 1.04% on the publicly accessible JD.com dataset.
arXiv Detail & Related papers (2024-06-23T21:28:50Z)
- Stock Volume Forecasting with Advanced Information by Conditional Variational Auto-Encoder [49.97673761305336]
We demonstrate the use of a Conditional Variational Auto-Encoder (CVAE) to improve forecasts of daily stock volume time series in both short- and long-term forecasting tasks.
CVAE generates non-linear time series as out-of-sample forecasts that are more accurate and whose correlation structure more closely matches the actual data.
arXiv Detail & Related papers (2024-06-19T13:13:06Z)
- ForecastPFN: Synthetically-Trained Zero-Shot Forecasting [16.12148632541671]
ForecastPFN is the first zero-shot forecasting model trained purely on a novel synthetic data distribution.
We show that zero-shot predictions made by ForecastPFN are more accurate and faster than those of state-of-the-art forecasting methods.
arXiv Detail & Related papers (2023-11-03T14:17:11Z)
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z)
- DeepVol: Volatility Forecasting from High-Frequency Data with Dilated Causal Convolutions [53.37679435230207]
We propose DeepVol, a model based on Dilated Causal Convolutions that uses high-frequency data to forecast day-ahead volatility.
Our empirical results suggest that the proposed deep learning-based approach effectively learns global features from high-frequency data.
arXiv Detail & Related papers (2022-09-23T16:13:47Z)
- Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce Probabilistic AutoRegressive Neural Networks (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z)
- RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting [30.277213545837924]
Classical statistical models often fall short in handling the complexity and high non-linearity present in time-series data.
In this work, we consider the time-series data as a random realization from a nonlinear state-space model.
We use particle flow as the tool for approximating the posterior distribution of the states, as it is shown to be highly effective in complex, high-dimensional settings.
arXiv Detail & Related papers (2021-06-10T21:49:23Z)
- Robust Validation: Confident Predictions Even When Distributions Shift [19.327409270934474]
We describe procedures for robust predictive inference, where a model provides uncertainty estimates on its predictions rather than point predictions.
We present a method that produces prediction sets (almost exactly) giving the right coverage level for any test distribution in an $f$-divergence ball around the training population.
An essential component of our methodology is to estimate the amount of expected future data shift and build robustness to it.
arXiv Detail & Related papers (2020-08-10T17:09:16Z)