Learning Non-Stationary Time-Series with Dynamic Pattern Extractions
- URL: http://arxiv.org/abs/2111.10559v1
- Date: Sat, 20 Nov 2021 10:52:37 GMT
- Title: Learning Non-Stationary Time-Series with Dynamic Pattern Extractions
- Authors: Xipei Wang, Haoyu Zhang, Yuanbo Zhang, Meng Wang, Jiarui Song, Tin Lai, Matloob Khushi
- Abstract summary: State-of-the-art algorithms have achieved a decent performance in dealing with stationary temporal data.
Traditional algorithms that tackle stationary time-series do not apply to non-stationary series like Forex trading.
This paper investigates applicable models that can improve the accuracy of forecasting future trends of non-stationary time-series sequences.
- Score: 16.19692047595777
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The era of information explosion had prompted the accumulation of a
tremendous amount of time-series data, including stationary and non-stationary
time-series data. State-of-the-art algorithms have achieved a decent
performance in dealing with stationary temporal data. However, traditional
algorithms that tackle stationary time-series do not apply to non-stationary
series like Forex trading. This paper investigates applicable models that can
improve the accuracy of forecasting future trends of non-stationary time-series
sequences. In particular, we focus on identifying potential models and
investigate the effects of recognizing patterns from historical data. We
propose a combination of the seq2seq model based on RNN, along with
an attention mechanism and an enriched set of features extracted via dynamic time
warping and zigzag peak-valley indicators. Customized loss functions and
evaluation metrics have been designed to focus more on the predicted
sequence's peak and valley points. Our results show that our model can predict
4-hour future trends with high accuracy in the Forex dataset, which is crucial
in realistic scenarios to assist foreign exchange trading decision making. We
further provide evaluations of the effects of various loss functions,
evaluation metrics, model variants, and components on model performance.
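The abstract names two concrete ingredients without giving their formulation here: zigzag peak-valley indicators used as enriched features and a customized loss that emphasizes peaks and valleys. The following is a minimal sketch of what such components could look like, assuming a simple percentage-threshold zigzag and a pivot-weighted MSE; the function names, the `threshold` and `pivot_weight` parameters, and the weighting scheme are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def zigzag_pivots(prices, threshold=0.005):
    """Label swing points: a pivot is confirmed once price reverses by more
    than `threshold` (fractional change) from the last running extreme.
    Returns an array with +1 at peaks, -1 at valleys, 0 elsewhere.
    (Illustrative sketch; not the paper's exact indicator.)"""
    pivots = np.zeros(len(prices), dtype=int)
    last_idx, last_price = 0, prices[0]   # current running extreme
    direction = 0                         # +1 rising leg, -1 falling leg, 0 unset
    for i, p in enumerate(prices[1:], start=1):
        change = (p - last_price) / last_price
        if direction >= 0 and change <= -threshold:
            pivots[last_idx] = 1          # previous extreme confirmed as a peak
            direction, last_idx, last_price = -1, i, p
        elif direction <= 0 and change >= threshold:
            pivots[last_idx] = -1         # previous extreme confirmed as a valley
            direction, last_idx, last_price = 1, i, p
        elif (direction >= 0 and p > last_price) or \
             (direction <= 0 and p < last_price):
            last_idx, last_price = i, p   # extend the current leg
    return pivots

def peak_valley_weighted_mse(y_true, y_pred, pivot_mask, pivot_weight=5.0):
    """MSE that up-weights errors at labelled peak/valley time steps,
    one plausible way to 'focus more on peaks and valleys'."""
    weights = np.where(pivot_mask != 0, pivot_weight, 1.0)
    return float(np.mean(weights * (y_true - y_pred) ** 2))
```

For example, `zigzag_pivots(prices, threshold=0.002)` on a short Forex price window yields a sparse +1/-1 mask that can be fed to the seq2seq model as an extra input channel and reused as `pivot_mask` in the weighted loss; the actual thresholds and weights used in the paper are not stated in this summary.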
Related papers
- TimeInf: Time Series Data Contribution via Influence Functions [8.018453062120916]
TimeInf is a data contribution estimation method for time-series datasets.
Our empirical results demonstrate that TimeInf outperforms state-of-the-art methods in identifying harmful anomalies.
TimeInf offers intuitive and interpretable attributions of data values, allowing us to easily distinguish diverse anomaly patterns through visualizations.
arXiv Detail & Related papers (2024-07-21T19:10:40Z)
- Learning Graph Structures and Uncertainty for Accurate and Calibrated Time-series Forecasting [65.40983982856056]
We introduce STOIC, which leverages correlations between time-series to learn their underlying structure and to provide well-calibrated and accurate forecasts.
Over a wide range of benchmark datasets, STOIC provides 16% more accurate and better-calibrated forecasts.
arXiv Detail & Related papers (2024-07-02T20:14:32Z)
- Probing the Robustness of Time-series Forecasting Models with CounterfacTS [1.823020744088554]
We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks.
CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts.
arXiv Detail & Related papers (2024-03-06T07:34:47Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z)
- An End-to-End Time Series Model for Simultaneous Imputation and Forecast [14.756607742477252]
We develop an end-to-end time series model that aims to learn the inference relation and make a multiple-step ahead forecast.
Our framework trains jointly two neural networks, one to learn the feature-wise correlations and the other for the modeling of temporal behaviors.
arXiv Detail & Related papers (2023-06-01T15:08:22Z)
- DeepVol: Volatility Forecasting from High-Frequency Data with Dilated Causal Convolutions [53.37679435230207]
We propose DeepVol, a model based on Dilated Causal Convolutions that uses high-frequency data to forecast day-ahead volatility.
Our empirical results suggest that the proposed deep learning-based approach effectively learns global features from high-frequency data.
arXiv Detail & Related papers (2022-09-23T16:13:47Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Back2Future: Leveraging Backfill Dynamics for Improving Real-time Predictions in Future [73.03458424369657]
In real-time forecasting in public health, data collection is a non-trivial and demanding task.
The 'backfill' phenomenon and its effect on model performance have barely been studied in the prior literature.
We formulate a novel problem and neural framework Back2Future that aims to refine a given model's predictions in real-time.
arXiv Detail & Related papers (2021-06-08T14:48:20Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)