Analysis and modeling to forecast in time series: a systematic review
- URL: http://arxiv.org/abs/2104.00164v1
- Date: Wed, 31 Mar 2021 23:48:46 GMT
- Title: Analysis and modeling to forecast in time series: a systematic review
- Authors: Fatoumata Dama, Christine Sinoquet
- Abstract summary: This paper surveys state-of-the-art methods and models dedicated to time series analysis and modeling, with the final aim of prediction.
This review aims to offer a structured and comprehensive view of the full process flow, and encompasses time series decomposition, stationarity tests, modeling and forecasting.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper surveys state-of-the-art methods and models dedicated to time
series analysis and modeling, with the final aim of prediction. This review
aims to offer a structured and comprehensive view of the full process flow, and
encompasses time series decomposition, stationarity tests, modeling and
forecasting. In addition, for didactic purposes, a unified presentation has
been adopted throughout this survey, to present decomposition frameworks on the
one hand and linear and nonlinear time series models on the other hand. First,
we clarify the relationship between stationarity and linearity, and further
examine the main classes of methods used to test for weak stationarity. Next,
the main frameworks for time series decomposition are presented in a unified
way: depending on the time series, a more or less complex decomposition scheme
seeks to separate the nonstationary effects (the deterministic components) from the
remaining stochastic component. Appropriate modeling of the latter is a
critical step to guarantee prediction accuracy. We then present three popular
linear models, together with two of their more flexible variants. A step
further in model complexity, and still in a unified way, we present five major
nonlinear models used for time series. Amongst nonlinear models, artificial
neural networks hold a place apart as deep learning has recently gained
considerable attention. A whole section is therefore dedicated to time series
forecasting relying on deep learning approaches. A final section provides a
list of R and Python implementations for the methods, models and tests
presented throughout this review. In this document, our intention is to bring
sufficient in-depth knowledge, while covering a broad range of models and
forecasting methods: this compilation spans from well-established conventional
approaches to more recent adaptations of deep learning to time series
forecasting.
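As an illustration of the process flow the survey outlines (stationarity testing, decomposition into deterministic and stochastic components, modeling of the stochastic remainder, forecasting), below is a minimal Python sketch using statsmodels. The synthetic monthly series, the choice of ADF test, STL decomposition and an AR(1)/ARIMA model, and all parameter values are illustrative assumptions made here for concreteness; they are not taken from the paper, which itself lists the relevant R and Python implementations.

```python
# Minimal sketch of a classical forecasting workflow (illustrative, not the survey's own code).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly series: linear trend + yearly seasonality + AR(1) noise.
rng = np.random.default_rng(0)
n = 144
t = np.arange(n)
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(scale=1.0)
y = pd.Series(0.05 * t + 3.0 * np.sin(2 * np.pi * t / 12) + noise,
              index=pd.date_range("2010-01", periods=n, freq="MS"))

# 1) Weak-stationarity check: Augmented Dickey-Fuller test on the raw series.
adf_stat, p_value, *_ = adfuller(y)
print(f"ADF p-value (raw series): {p_value:.3f}")  # a large p-value suggests nonstationarity

# 2) Decomposition: separate the deterministic trend/seasonal components
#    from the remaining stochastic component.
stl = STL(y, period=12).fit()
remainder = stl.resid

# 3) Re-test the stochastic remainder, then fit a simple linear (AR) model to it.
print(f"ADF p-value (remainder): {adfuller(remainder)[1]:.3f}")
model = ARIMA(remainder, order=(1, 0, 0)).fit()

# 4) Forecast: combine the model's forecast of the stochastic part with a naive
#    extrapolation of the deterministic components (linear trend, last seasonal cycle).
h = 12
resid_fc = model.forecast(steps=h)
trend_slope = (stl.trend.iloc[-1] - stl.trend.iloc[-13]) / 12
trend_fc = stl.trend.iloc[-1] + trend_slope * np.arange(1, h + 1)
season_fc = stl.seasonal.iloc[-12:].to_numpy()
forecast = trend_fc + season_fc + resid_fc.to_numpy()
print(forecast[:3])
```

In practice, models such as statsmodels' SARIMAX or the R forecast package can absorb trend and seasonality through differencing and seasonal terms rather than an explicit decomposition; the explicit steps above are kept separate only to mirror the process flow described in the survey.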
Related papers
- Recurrent Neural Goodness-of-Fit Test for Time Series [8.22915954499148]
Time series data are crucial across diverse domains such as finance and healthcare.
Traditional evaluation metrics fall short due to the temporal dependencies and potential high dimensionality of the features.
We propose the REcurrent NeurAL (RENAL) Goodness-of-Fit test, a novel and statistically rigorous framework for evaluating generative time series models.
arXiv Detail & Related papers (2024-10-17T19:32:25Z)
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Meta-Forecasting by combining Global Deep Representations with Local Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR).
It adapts to each time series by learning in closed-form the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
arXiv Detail & Related papers (2021-11-05T11:45:02Z)
- Randomized Neural Networks for Forecasting Time Series with Multiple Seasonality [0.0]
This work contributes to the development of neural forecasting models with novel randomization-based learning methods.
A pattern-based representation of time series makes the proposed approach useful for forecasting time series with multiple seasonality.
arXiv Detail & Related papers (2021-07-04T18:39:27Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Time Adaptive Gaussian Model [0.913755431537592]
Our model is a generalization of state-of-the-art methods for the inference of temporal graphical models.
It performs pattern recognition by clustering data points in time, and it finds probabilistic (and possibly causal) relationships among the observed variables.
arXiv Detail & Related papers (2021-02-02T00:28:14Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.