Time Series Analysis by State Space Learning
- URL: http://arxiv.org/abs/2408.09120v1
- Date: Sat, 17 Aug 2024 07:04:26 GMT
- Title: Time Series Analysis by State Space Learning
- Authors: André Ramos, Davi Valladão, Alexandre Street
- Abstract summary: Time series analysis by state-space models is widely used in forecasting and extracting unobservable components like level, slope, and seasonality, along with explanatory variables.
Our research introduces State Space Learning (SSL), a novel framework and paradigm that leverages statistical learning to provide a comprehensive approach to time series modeling and forecasting.
- Score: 44.99833362998488
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Time series analysis by state-space models is widely used in forecasting and extracting unobservable components like level, slope, and seasonality, along with explanatory variables. However, their reliance on traditional Kalman filtering frequently hampers their effectiveness, primarily due to Gaussian assumptions and the absence of efficient subset selection methods to accommodate the multitude of potential explanatory variables in today's big-data applications. Our research introduces State Space Learning (SSL), a novel framework and paradigm that leverages statistical learning to provide a comprehensive approach to time series modeling and forecasting. By utilizing a regularized high-dimensional regression framework, our approach jointly extracts typical time series unobservable components, detects and addresses outliers, and selects the influence of exogenous variables within a high-dimensional space in polynomial time, with global optimality guarantees. Through a controlled numerical experiment, we demonstrate the superiority of our approach over relevant benchmarks in the accuracy of explanatory-variable subset selection. We also present an intuitive forecasting scheme and showcase superior performance relative to traditional time series models using a dataset of 48,000 monthly time series from the M4 competition. We extend the applicability of our approach by reformulating any linear state space formulation featuring time-varying coefficients into high-dimensional regularized regressions, expanding the impact of our research to other engineering applications beyond time series analysis. Finally, our proposed methodology is implemented in the Julia open-source package "StateSpaceLearning.jl".
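To make the idea concrete, here is a minimal Python sketch of the regularized high-dimensional regression described in the abstract: the design matrix collects level shifts, slope changes, seasonal dummies, outlier spikes, and candidate exogenous regressors, and an L1 penalty selects a sparse subset of them jointly. This is an illustration of the formulation only, not the StateSpaceLearning.jl API; the column construction, penalty value, and toy data are assumptions made for the example.

```python
# Illustrative sketch (not the StateSpaceLearning.jl API): cast a basic
# structural model (level + slope + seasonality + outliers + exogenous
# regressors) as one sparse regression and fit it with an L1 penalty.
import numpy as np
from sklearn.linear_model import Lasso

def ssl_design(n, season=12, X_exog=None):
    """Build a high-dimensional design matrix whose columns represent
    level/slope innovations, seasonal dummies, outlier spikes, and
    candidate exogenous regressors."""
    t = np.arange(n)
    cols = [np.ones(n), t]                                          # initial level and slope
    cols += [(t >= k).astype(float) for k in range(1, n)]           # level shifts
    cols += [np.maximum(t - k, 0.0) for k in range(1, n)]           # slope changes
    cols += [((t % season) == s).astype(float) for s in range(season)]  # seasonality
    cols += [(t == k).astype(float) for k in range(n)]              # outlier spikes
    D = np.column_stack(cols)
    if X_exog is not None:
        D = np.hstack([D, X_exog])                                  # explanatory variables
    return D

# toy monthly series: trend + seasonality + one outlier + noise
rng = np.random.default_rng(0)
n = 120
y = 0.1 * np.arange(n) + 2 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 0.3, n)
y[60] += 8.0                                     # an additive outlier
X_exog = rng.normal(size=(n, 50))                # mostly irrelevant candidate regressors

D = ssl_design(n, season=12, X_exog=X_exog)
model = Lasso(alpha=0.05, fit_intercept=False, max_iter=50_000).fit(D, y)
active = np.flatnonzero(model.coef_)             # sparse set of selected effects
print(f"{active.size} active columns out of {D.shape[1]}")
```

In the paper's framework the penalized problem is solved in polynomial time with global optimality guarantees; here scikit-learn's coordinate-descent Lasso merely stands in as a convenient off-the-shelf solver.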
Related papers
- Advancing Enterprise Spatio-Temporal Forecasting Applications: Data Mining Meets Instruction Tuning of Language Models For Multi-modal Time Series Analysis in Low-Resource Settings [0.0]
Spatio-temporal forecasting is crucial in transportation, logistics, and supply chain management.
We propose a dynamic, multi-modal approach that integrates the strengths of traditional forecasting methods and instruction tuning of small language models.
Our framework enables on-premises customization with reduced computational and memory demands, while maintaining inference speed and data privacy/security.
arXiv Detail & Related papers (2024-08-24T16:32:58Z) - UniCL: A Universal Contrastive Learning Framework for Large Time Series Models [18.005358506435847]
Time-series analysis plays a pivotal role across a range of critical applications, from finance to healthcare.
Traditional supervised learning methods first annotate extensive labels for time-series data in each task.
This paper introduces UniCL, a universal and scalable contrastive learning framework designed for pretraining time-series foundation models.
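For orientation, the generic contrastive objective that pretraining frameworks of this kind typically build on is the InfoNCE-style loss below, where $z_i$ and $z_i^{+}$ are embeddings of two augmented views of the same series, $\mathrm{sim}$ is a similarity function, and $\tau$ a temperature; UniCL's specific augmentations and loss variant are described in the paper and may differ.

```latex
\mathcal{L}_{\mathrm{InfoNCE}}
  = -\frac{1}{N}\sum_{i=1}^{N}
    \log \frac{\exp\!\big(\mathrm{sim}(z_i, z_i^{+})/\tau\big)}
              {\sum_{j=1}^{N} \exp\!\big(\mathrm{sim}(z_i, z_j^{+})/\tau\big)}
```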
arXiv Detail & Related papers (2024-05-17T07:47:11Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present the Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive
Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectories, human motion, driving scenes, traffic flow, and weather forecasting.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z) - Optimal Latent Space Forecasting for Large Collections of Short Time
Series Using Temporal Matrix Factorization [0.0]
It is common practice to evaluate multiple methods and choose one of them, or an ensemble, to produce the best forecasts.
We propose a framework for forecasting short high-dimensional time series data by combining low-rank temporal matrix factorization and optimal model selection on latent time series.
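A hedged sketch of the general recipe (not the authors' exact algorithm): factorize the matrix of short series into a low-rank temporal basis, forecast each latent series with a simple autoregression, and project back. The rank, AR order, and toy data below are assumptions made for illustration.

```python
# Low-rank temporal factorization + latent forecasting, in sketch form.
import numpy as np

rng = np.random.default_rng(1)
Y = rng.normal(size=(500, 36))          # 500 short series, 36 time steps
rank, horizon, p = 5, 6, 3              # latent rank, forecast horizon, AR order

# low-rank temporal factorization via truncated SVD: Y ≈ W @ F
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
W = U[:, :rank] * s[:rank]              # series loadings
F = Vt[:rank]                           # latent time series (rank x T)

def ar_forecast(x, p, h):
    """Least-squares AR(p) fit and recursive h-step forecast of a 1-D series."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    hist, out = list(x[-p:]), []
    for _ in range(h):
        nxt = float(np.dot(coef, hist[-p:]))
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

F_future = np.vstack([ar_forecast(f, p, horizon) for f in F])   # forecast latents
Y_future = W @ F_future                 # lift back to the original 500 series
print(Y_future.shape)                   # (500, 6)
```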
arXiv Detail & Related papers (2021-12-15T11:39:21Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
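As one simple instantiation of such a three-stage pipeline (the clustering method, per-cluster forecaster, and per-series refinement are all swappable, and the authors' exact stages may differ), the sketch below clusters the series with k-means, forecasts each cluster mean with an AR(1) model, and forecasts each series' deviation from its cluster mean separately.

```python
# Hedged sketch of a cluster-then-forecast pipeline; one possible
# instantiation, not the authors' exact algorithm.
import numpy as np
from sklearn.cluster import KMeans

def ar1_forecast(x, h):
    """Fit y_t = a + b*y_{t-1} by least squares and forecast h steps ahead."""
    A = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    (a, b), *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    out, last = [], x[-1]
    for _ in range(h):
        last = a + b * last
        out.append(last)
    return np.array(out)

rng = np.random.default_rng(2)
Y = rng.normal(size=(200, 48))                     # 200 series, 48 time steps
h, k = 8, 4

labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Y)  # stage 1
forecasts = np.empty((Y.shape[0], h))
for c in range(k):
    members = np.flatnonzero(labels == c)
    center = Y[members].mean(axis=0)
    center_fc = ar1_forecast(center, h)            # stage 2: cluster-level forecast
    for i in members:                              # stage 3: per-series refinement
        forecasts[i] = center_fc + ar1_forecast(Y[i] - center, h)
print(forecasts.shape)                             # (200, 8)
```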
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Time Series Forecasting Using Manifold Learning [6.316185724124034]
We present a three-tier numerical framework based on manifold learning for forecasting high-dimensional time series.
At the first step, we embed the time series into a reduced low-dimensional space using a nonlinear manifold learning algorithm.
At the second step, we construct reduced-order regression models on the manifold to forecast the embedded dynamics.
At the final step, we lift the embedded time series back to the original high-dimensional space.
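The sketch below instantiates the three steps with one concrete choice each: an Isomap embedding, a least-squares VAR(1) on the embedded coordinates, and RBF interpolation for the lifting step. The paper's actual choices of manifold learning algorithm, regression model, and lifting operator may differ.

```python
# Embed -> forecast on the manifold -> lift back, in sketch form.
import numpy as np
from sklearn.manifold import Isomap
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)
T, D, d, h = 300, 50, 3, 10                  # steps, ambient dim, latent dim, horizon
Y = rng.normal(size=(T, D)).cumsum(axis=0)   # toy high-dimensional time series

# Step 1: nonlinear embedding into a reduced low-dimensional space.
Z = Isomap(n_components=d, n_neighbors=10).fit_transform(Y)

# Step 2: reduced-order regression on the manifold (a VAR(1) fit by least
# squares) to forecast the embedded dynamics.
A, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)
z, Z_future = Z[-1], []
for _ in range(h):
    z = z @ A
    Z_future.append(z)
Z_future = np.array(Z_future)

# Step 3: lift the forecasts back to the original space via RBF interpolation
# learned from the (embedding -> observation) pairs.
lift = RBFInterpolator(Z, Y, neighbors=50, smoothing=1e-3)
Y_future = lift(Z_future)
print(Y_future.shape)                        # (10, 50)
```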
arXiv Detail & Related papers (2021-10-07T17:09:59Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
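A minimal sketch of the forced-linear-system idea, assuming known forcing inputs: stack states and inputs and identify the system matrices by least squares. The paper's ensemble and stochastic-forcing machinery is omitted here, and the simulated data is an assumption for the example.

```python
# Fit a forced linear system x_{t+1} ≈ A x_t + B u_t by least squares.
import numpy as np

rng = np.random.default_rng(4)
T, n, m = 500, 6, 2
A_true = 0.95 * np.linalg.qr(rng.normal(size=(n, n)))[0]   # stable toy dynamics
B_true = rng.normal(size=(n, m))
U = rng.normal(size=(T, m))                                # known forcing inputs
X = np.zeros((T + 1, n))
for t in range(T):
    X[t + 1] = A_true @ X[t] + B_true @ U[t] + 0.01 * rng.normal(size=n)

# Stack states and inputs and solve for [A B] jointly: X' ≈ [X U] @ [A B]^T
G = np.hstack([X[:-1], U])
AB, *_ = np.linalg.lstsq(G, X[1:], rcond=None)
A_hat, B_hat = AB[:n].T, AB[n:].T
print(np.linalg.norm(A_hat - A_true), np.linalg.norm(B_hat - B_true))

X_pred = G @ AB                          # one-step forecasts with the fitted model
```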
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Multivariate Probabilistic Time Series Forecasting via Conditioned
Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
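For background, a conditioned normalizing flow models the predictive density through the change-of-variables formula below, with an invertible map $f$ conditioned on a context vector $\mathbf{h}_t$ (for example, a recurrent summary of the past); the specific flow architecture and conditioning used in the paper are not reproduced here.

```latex
p\big(\mathbf{y}_t \mid \mathbf{h}_t\big)
  = p_{Z}\!\big(f(\mathbf{y}_t;\,\mathbf{h}_t)\big)\,
    \left|\det \frac{\partial f(\mathbf{y}_t;\,\mathbf{h}_t)}{\partial \mathbf{y}_t}\right|
```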
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.