HERMES: Hybrid Error-corrector Model with inclusion of External Signals
for nonstationary fashion time series
- URL: http://arxiv.org/abs/2202.03224v3
- Date: Mon, 11 Sep 2023 09:06:28 GMT
- Title: HERMES: Hybrid Error-corrector Model with inclusion of External Signals
for nonstationary fashion time series
- Authors: Etienne David (TIPIC-SAMOVAR), Jean Bellot, Sylvain Le Corff (IP
Paris)
- Abstract summary: By tracking thousands of fashion trends on social media with state-of-the-art computer vision approaches, we propose a new model for fashion time series forecasting.
This hybrid model provides state-of-the-art results on the proposed fashion dataset and on the weekly time series of the M4 competition, and illustrates the benefit of external weak signals.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Developing models and algorithms to predict nonstationary time series is a
long-standing statistical problem. It is crucial for many applications, in
particular in the fashion and retail industries, to make optimal inventory
decisions and avoid massive waste. By tracking thousands of fashion trends on
social media with state-of-the-art computer vision approaches, we propose a new
model for fashion time series forecasting. Our contribution is twofold. First, we
publicly release a dataset gathering 10,000 weekly fashion time series. As
influence dynamics are key to emerging trend detection, we associate with
each time series an external weak signal representing the behaviour of
influencers. Second, to leverage such a dataset, we propose a new hybrid
forecasting model. Our approach combines per-time-series parametric models with
seasonal components and a global recurrent neural network to include sporadic
external signals. This hybrid model provides state-of-the-art results on the
proposed fashion dataset and on the weekly time series of the M4 competition, and
illustrates the benefit of external weak signals.
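The two-stage hybrid described in the abstract can be sketched briefly: a per-series seasonal component plays the role of the parametric model, and a corrector trained on lagged residuals plus the external weak signal stands in for the paper's global recurrent neural network. This is a minimal illustration, not the authors' implementation: the seasonal-profile fit, the linear least-squares corrector, and the synthetic "influencer" signal are all assumptions made for the sketch.

```python
import numpy as np

def seasonal_fit(y, period=52):
    """Per-series parametric component: the mean seasonal profile
    (a simple stand-in for the paper's per-time-series seasonal model)."""
    n = len(y)
    profile = np.array([y[i::period].mean() for i in range(period)])
    return np.tile(profile, n // period + 1)[:n]

def hybrid_forecast(y, ext, period=52):
    """Two-stage sketch: (1) per-series seasonal predictor, (2) a corrector
    trained on lagged residuals and the external weak signal. A linear
    least-squares corrector replaces the paper's global RNN for illustration."""
    base = seasonal_fit(y, period)
    resid = y - base
    # features at step t: previous residual, previous external signal, intercept
    X = np.column_stack([resid[:-1], ext[:-1], np.ones(len(y) - 1)])
    w, *_ = np.linalg.lstsq(X, resid[1:], rcond=None)
    return base[1:] + X @ w

# Synthetic weekly series driven by a lagged external (influencer-like) signal.
rng = np.random.default_rng(0)
t = np.arange(208)                      # four years of weekly observations
ext = rng.normal(size=208)              # hypothetical external weak signal
y = 10 + 3 * np.sin(2 * np.pi * t / 52) + 0.5 * np.roll(ext, 1) \
    + rng.normal(0, 0.1, 208)
mae_seasonal = np.abs(y[1:] - seasonal_fit(y)[1:]).mean()
mae_hybrid = np.abs(y[1:] - hybrid_forecast(y, ext)).mean()
print(mae_seasonal, mae_hybrid)
```

On this toy series the corrector absorbs the part of the signal the seasonal model cannot explain, which is the intuition behind feeding residuals and external signals to the second stage.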
Related papers
- Variational quantization for state space models [3.9762742923544456]
Forecasting using large datasets gathering thousands of heterogeneous time series is a crucial statistical problem in numerous sectors.
We propose a new forecasting model that combines discrete state space hidden Markov models with recent neural network architectures and training procedures inspired by vector quantized variational autoencoders.
We assess the performance of the proposed method using several datasets and show that it outperforms other state-of-the-art solutions.
arXiv Detail & Related papers (2024-04-17T07:01:41Z) - Recency-Weighted Temporally-Segmented Ensemble for Time-Series Modeling [0.0]
Time-series modeling in process industries faces the challenge of dealing with complex, multi-faceted, and evolving data characteristics.
We introduce the Recency-Weighted Temporally-Segmented (ReWTS) ensemble model, a novel chunk-based approach for multi-step forecasting.
We present a comparative analysis, utilizing two years of data from a wastewater treatment plant and a drinking water treatment plant in Norway.
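A chunk-based, recency-weighted ensemble like ReWTS can be sketched as follows: fit one simple model per temporal chunk of the history, then weight each model by its performance on the most recent window before combining forecasts. The AR(2) chunk models and the inverse-error weighting rule are illustrative choices, not the paper's method.

```python
import numpy as np

def fit_chunk_models(y, chunk=50, lag=2):
    """Fit one AR(lag) model per temporal chunk (stand-in for the
    paper's per-chunk models)."""
    models = []
    for s in range(0, len(y) - chunk + 1, chunk):
        seg = y[s:s + chunk]
        X = np.column_stack([seg[i:len(seg) - lag + i] for i in range(lag)]
                            + [np.ones(len(seg) - lag)])
        w, *_ = np.linalg.lstsq(X, seg[lag:], rcond=None)
        models.append(w)
    return models

def rewts_predict(models, recent, lag=2):
    """Weight each chunk model by its inverse error on the most recent window
    (an illustrative recency-weighting rule), then combine one-step forecasts."""
    X = np.column_stack([recent[i:len(recent) - lag + i] for i in range(lag)]
                        + [np.ones(len(recent) - lag)])
    target = recent[lag:]
    preds, weights = [], []
    for w in models:
        err = np.abs(X @ w - target).mean()     # recent performance
        weights.append(1.0 / (err + 1e-8))
        preds.append(np.dot(np.r_[recent[-lag:], 1.0], w))
    weights = np.array(weights) / sum(weights)
    return float(np.dot(weights, preds))

# Usage on a synthetic AR(1) series: 3 chunk models, weighted one-step forecast.
rng = np.random.default_rng(1)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * y[t - 1] + rng.normal(0, 0.1)
models = fit_chunk_models(y[:150])
pred_val = rewts_predict(models, y[150:])
```

The point of the weighting step is that models fitted to chunks resembling current conditions dominate the combination, which is how a chunk ensemble adapts to evolving data characteristics.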
arXiv Detail & Related papers (2024-03-04T16:00:35Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present Moirai, a Masked Encoder-based Universal Time Series Forecasting Transformer.
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - G-NM: A Group of Numerical Time Series Prediction Models [0.0]
The Group of Numerical Time Series Prediction Models (G-NM) encapsulates the linear and non-linear dependencies, seasonalities, and trends present in time series data.
G-NM is explicitly constructed to augment our predictive capabilities related to patterns and trends inherent in complex natural phenomena.
arXiv Detail & Related papers (2023-06-20T16:39:27Z) - Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove noise from the time series.
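The spectral-filtering idea can be illustrated with a plain FFT: keep only the strongest frequency components of a series to expose global trend and seasonality while discarding noise. This top-k filter is a simple illustration of filtering in the spectral domain, not the paper's learned attention module.

```python
import numpy as np

def spectral_filter(x, k=4):
    """Keep the k strongest frequency components and zero the rest:
    an illustrative spectral filter that exposes dominant seasonality
    while suppressing broadband noise."""
    spec = np.fft.rfft(x)
    keep = np.argsort(np.abs(spec))[-k:]    # indices of dominant components
    mask = np.zeros_like(spec)
    mask[keep] = spec[keep]
    return np.fft.irfft(mask, n=len(x))

# Two seasonal components buried in noise; filtering recovers them.
t = np.arange(256)
clean = np.sin(2 * np.pi * t / 32) + 0.3 * np.sin(2 * np.pi * t / 8)
noisy = clean + np.random.default_rng(2).normal(0, 0.4, 256)
denoised = spectral_filter(noisy, k=4)
```

Because the seasonal components concentrate their energy in a few frequency bins while noise spreads across all bins, the filtered series is much closer to the clean signal than the raw one.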
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Leveraging Multiple Relations for Fashion Trend Forecasting Based on
Social Media [72.06420633156479]
We propose an improved model named Relation Enhanced Attention Recurrent (REAR) network.
Compared to KERN, the REAR model leverages not only the relations among fashion elements but also those among user groups.
To further improve the performance of long-range trend forecasting, the REAR method devises a sliding temporal attention mechanism.
arXiv Detail & Related papers (2021-05-07T14:52:03Z) - Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
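The adaptive-weighting idea behind such a mixture of experts can be sketched with a softmax gate: experts with lower recent error receive higher weight in the combined forecast. This error-based gate is a hand-crafted stand-in for MAES's learned attention-based gating; the names and the temperature parameter are assumptions of the sketch.

```python
import numpy as np

def softmax(z):
    z = z - z.max()                 # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

def gated_mixture(expert_preds, expert_recent_errs, temperature=1.0):
    """Combine expert forecasts with weights that decay in recent error:
    a simple stand-in for an attention-based gating mechanism."""
    weights = softmax(-np.asarray(expert_recent_errs) / temperature)
    return float(np.dot(weights, expert_preds)), weights

# Three experts; the first has been most accurate recently.
preds = np.array([1.0, 2.0, 3.0])
errs = [0.1, 1.0, 2.0]
combined, weights = gated_mixture(preds, errs)
```

Under temporal shift, the gate shifts weight toward whichever expert currently tracks the dynamics best, which is the mechanism the summary describes.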
arXiv Detail & Related papers (2021-02-23T05:23:35Z) - Global Models for Time Series Forecasting: A Simulation Study [2.580765958706854]
We simulate time series from simple data generating processes (DGP), such as Auto Regressive (AR) and Seasonal AR, to complex DGPs, such as Chaotic Logistic Map, Self-Exciting Threshold Auto-Regressive, and Mackey-Glass equations.
The lengths and the number of series in the dataset are varied in different scenarios.
We perform experiments on these datasets using global forecasting models including Recurrent Neural Networks (RNN), Feed-Forward Neural Networks, Pooled Regression (PR) models, and Light Gradient Boosting Models (LGBM).
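Two of the data generating processes named above are easy to simulate directly: a linear AR(1) process and the chaotic logistic map. The parameter values below (phi=0.7, r=3.9) are illustrative choices, not those of the study.

```python
import numpy as np

def simulate_ar(n, phi=0.7, sigma=1.0, seed=0):
    """AR(1) process: y[t] = phi * y[t-1] + eps, eps ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal(0, sigma)
    return y

def simulate_logistic_map(n, r=3.9, x0=0.5):
    """Logistic map: x[t+1] = r * x[t] * (1 - x[t]); chaotic around r = 3.9."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1 - x[t - 1])
    return x

ar = simulate_ar(500)
lm = simulate_logistic_map(200)
```

Varying the generator, the series length, and the number of series per dataset is exactly the kind of controlled setup such a simulation study uses to compare global models.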
arXiv Detail & Related papers (2020-12-23T04:45:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.