OFTER: An Online Pipeline for Time Series Forecasting
- URL: http://arxiv.org/abs/2304.03877v1
- Date: Sat, 8 Apr 2023 00:18:03 GMT
- Title: OFTER: An Online Pipeline for Time Series Forecasting
- Authors: Nikolas Michael, Mihai Cucuringu, Sam Howison
- Abstract summary: OFTER is a time series forecasting pipeline tailored for mid-sized multivariate time series.
It is specifically designed for online tasks, has an interpretable output, and is able to outperform several state-of-the-art baselines.
The computational efficacy of the algorithm, its online nature, and its ability to operate in low signal-to-noise regimes render OFTER an ideal approach for financial time series problems.
- Score: 3.9962751777898955
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce OFTER, a time series forecasting pipeline tailored for mid-sized
multivariate time series. OFTER utilizes the non-parametric models of k-nearest
neighbors and Generalized Regression Neural Networks, integrated with a
dimensionality reduction component. To circumvent the curse of dimensionality,
we employ a weighted norm based on a modified version of the maximal
correlation coefficient. The pipeline we introduce is specifically designed for
online tasks, has an interpretable output, and is able to outperform several
state-of-the-art baselines. The computational efficacy of the algorithm, its
online nature, and its ability to operate in low signal-to-noise regimes
render OFTER an ideal approach for financial multivariate time series problems,
such as daily equity forecasting. Our work demonstrates that while deep
learning models hold significant promise for time series forecasting,
traditional methods carefully integrating mainstream tools remain very
competitive alternatives with the added benefits of scalability and
interpretability.
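- Illustrative sketch: a minimal, hypothetical Python rendering of the pipeline described above. It retrieves the k nearest stored patterns under a feature-weighted norm and combines their targets with a GRNN-style (Nadaraya-Watson) Gaussian kernel average. This is not the authors' implementation: plain Pearson correlations stand in for the modified maximal correlation coefficient, and the names and parameters (correlation_weights, knn_grnn_forecast, k, bandwidth) are illustrative.

import numpy as np

def correlation_weights(X, y, eps=1e-8):
    # Weight each feature by |corr(feature, target)|, normalised to sum to 1.
    # Assumption: plain Pearson correlation, not the paper's modified maximal
    # correlation coefficient.
    corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    w = np.abs(np.nan_to_num(corrs)) + eps
    return w / w.sum()

def knn_grnn_forecast(X_train, y_train, x_query, k=20, bandwidth=1.0):
    # Forecast the target for x_query from its k nearest stored patterns.
    w = correlation_weights(X_train, y_train)
    d = np.sqrt(((X_train - x_query) ** 2 * w).sum(axis=1))   # weighted norm
    idx = np.argsort(d)[:k]                                    # k nearest neighbours
    kern = np.exp(-(d[idx] ** 2) / (2.0 * bandwidth ** 2))     # Gaussian kernel weights
    return float((kern * y_train[idx]).sum() / kern.sum())    # GRNN-style average

# Toy usage: lagged multivariate features predicting a noisy linear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = 0.6 * X[:, 0] - 0.3 * X[:, 3] + 0.1 * rng.normal(size=500)
print(knn_grnn_forecast(X[:-1], y[:-1], X[-1], k=25, bandwidth=0.8))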
Related papers
- UmambaTSF: A U-shaped Multi-Scale Long-Term Time Series Forecasting Method Using Mamba [7.594115034632109]
We propose UmambaTSF, a novel long-term time series forecasting framework.
It integrates the multi-scale feature extraction capabilities of U-shaped encoder-decoder multilayer perceptrons (MLPs) with Mamba's long-sequence representation.
UmambaTSF achieves state-of-the-art performance and excellent generality on widely used benchmark datasets.
arXiv Detail & Related papers (2024-10-15T04:56:43Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Reprogramming Foundational Large Language Models(LLMs) for Enterprise Adoption for Spatio-Temporal Forecasting Applications: Unveiling a New Era in Copilot-Guided Cross-Modal Time Series Representation Learning [0.0]
Spatio-temporal forecasting plays a crucial role in various sectors such as transportation systems, logistics, and supply chain management.
We introduce a hybrid approach that combines the strengths of open-source large and small-scale language models (LLMs and LMs) with traditional forecasting methods.
arXiv Detail & Related papers (2024-08-26T16:11:53Z) - Advancing Enterprise Spatio-Temporal Forecasting Applications: Data Mining Meets Instruction Tuning of Language Models For Multi-modal Time Series Analysis in Low-Resource Settings [0.0]
Spatio-temporal forecasting is crucial in transportation, logistics, and supply chain management.
We propose a dynamic, multi-modal approach that integrates the strengths of traditional forecasting methods and instruction tuning of small language models.
Our framework enables on-premises customization with reduced computational and memory demands, while maintaining inference speed and data privacy/security.
arXiv Detail & Related papers (2024-08-24T16:32:58Z) - MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and conditional discriminator to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z) - TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series [57.4208255711412]
Building on copula theory, we propose a simplified objective for the recently introduced transformer-based attentional copulas (TACTiS).
We show that the resulting model has significantly better training dynamics and achieves state-of-the-art performance across diverse real-world forecasting tasks.
arXiv Detail & Related papers (2023-10-02T16:45:19Z) - OneNet: Enhancing Time Series Forecasting Models under Concept Drift by Online Ensembling [65.93805881841119]
We propose Online ensembling Network (OneNet) to address the concept drifting problem.
OneNet reduces online forecasting error by more than 50% compared to the state-of-the-art (SOTA) method.
arXiv Detail & Related papers (2023-09-22T06:59:14Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z) - Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting [4.131842516813833]
We introduce a novel temporal latent auto-encoder method which enables nonlinear factorization of time series.
By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Our model achieves state-of-the-art performance on many popular multivariate datasets, with gains sometimes as high as 50% for several standard metrics.
arXiv Detail & Related papers (2021-01-25T22:29:40Z) - Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)