Intermittent Demand Forecasting with Renewal Processes
- URL: http://arxiv.org/abs/2010.01550v1
- Date: Sun, 4 Oct 2020 11:32:54 GMT
- Title: Intermittent Demand Forecasting with Renewal Processes
- Authors: Ali Caner Turkmen, Tim Januschowski, Yuyang Wang and Ali Taylan Cemgil
- Abstract summary: We introduce a new, unified framework for building intermittent demand forecasting models.
Our framework is based on extensions of well-established model-based methods to discrete-time renewal processes.
We report predictive accuracy in a variety of scenarios that compares favorably to the state of the art.
- Score: 16.079036729678716
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Intermittency is a common and challenging problem in demand forecasting. We
introduce a new, unified framework for building intermittent demand forecasting
models, which incorporates and allows us to generalize existing methods in several
directions. Our framework is based on extensions of well-established
model-based methods to discrete-time renewal processes, which can
parsimoniously account for patterns such as aging, clustering and
quasi-periodicity in demand arrivals. The connection to discrete-time renewal
processes allows not only for a principled extension of Croston-type models,
but also for a natural inclusion of neural-network-based models, by replacing
exponential smoothing with a recurrent neural network. We also demonstrate that
modeling continuous-time demand arrivals, i.e., with a temporal point process,
is possible via a trivial extension of our framework. This leads to more
flexible modeling in scenarios where data on individual purchase orders are
directly available with granular timestamps. Complementing this theoretical
advancement, we demonstrate the efficacy of our framework for forecasting
practice via an extensive empirical study on standard intermittent demand data
sets, in which we report predictive accuracy in a variety of scenarios that
compares favorably to the state of the art.
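The classic point of departure for this framework is Croston's method, which smooths non-zero demand sizes and inter-demand intervals separately and forecasts their ratio; the paper recasts such models as discrete-time renewal processes and obtains neural variants by swapping the exponential-smoothing recursions for a recurrent network. As a point of reference, the following is a minimal sketch of the classic Croston baseline that the framework generalizes, not the paper's renewal-process model itself; the function name and the default smoothing constant are illustrative.

```python
import numpy as np

def croston_forecast(demand, alpha=0.1):
    """Classic Croston's method: smooth non-zero demand sizes and
    inter-demand intervals separately, then forecast their ratio.

    demand : 1-D array of per-period demand (mostly zeros for an
             intermittent series).
    alpha  : exponential smoothing constant (illustrative default).
    """
    z_hat = None            # smoothed non-zero demand size
    p_hat = None            # smoothed inter-demand interval
    periods_since_demand = 1
    for y in demand:
        if y > 0:
            if z_hat is None:            # initialise on the first demand
                z_hat, p_hat = float(y), float(periods_since_demand)
            else:                        # exponential smoothing updates
                z_hat += alpha * (y - z_hat)
                p_hat += alpha * (periods_since_demand - p_hat)
            periods_since_demand = 1
        else:
            periods_since_demand += 1
    if z_hat is None:                    # no demand observed at all
        return 0.0
    return z_hat / p_hat                 # expected demand per period

# Usage: forecast the per-period demand rate of an intermittent series.
y = np.array([0, 0, 3, 0, 0, 0, 2, 0, 4, 0, 0, 1])
print(croston_forecast(y, alpha=0.2))
```

The paper's deep variants would, roughly speaking, replace the two scalar smoothing recursions above with a recurrent neural network that conditions the distributions of demand sizes and inter-demand intervals on the full history.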
Related papers
- Recurrent Neural Goodness-of-Fit Test for Time Series [8.22915954499148]
Time series data are crucial across diverse domains such as finance and healthcare.
Traditional evaluation metrics fall short due to temporal dependencies and the potentially high dimensionality of the features.
We propose the REcurrent NeurAL (RENAL) Goodness-of-Fit test, a novel and statistically rigorous framework for evaluating generative time series models.
arXiv Detail & Related papers (2024-10-17T19:32:25Z) - A Survey on Diffusion Models for Time Series and Spatio-Temporal Data [92.1255811066468]
We review the use of diffusion models for time series and spatio-temporal data, categorizing them by model, task type, data modality, and practical application domain.
We categorize diffusion models into unconditioned and conditioned types, and discuss time series and spatio-temporal data separately.
The survey covers their applications across fields including healthcare, recommendation, climate, energy, audio, and transportation.
arXiv Detail & Related papers (2024-04-29T17:19:40Z) - On the Efficient Marginalization of Probabilistic Sequence Models [3.5897534810405403]
This dissertation focuses on using autoregressive models to answer complex probabilistic queries.
We develop a class of novel and efficient approximation techniques for marginalization in sequential models that are model-agnostic.
arXiv Detail & Related papers (2024-03-06T19:29:08Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z) - Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z) - Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z) - Variational Conditional Dependence Hidden Markov Models for Skeleton-Based Action Recognition [7.9603223299524535]
This paper revisits conventional sequential modeling approaches, aiming to address the problem of capturing time-varying temporal dependency patterns.
We propose a different formulation of HMMs, whereby the dependence on past frames is dynamically inferred from the data.
We derive a tractable inference algorithm based on the forward-backward algorithm.
arXiv Detail & Related papers (2020-02-13T23:18:52Z)
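The tractable inference mentioned in the last entry builds on the standard forward-backward recursion for hidden Markov models. The sketch below is that textbook algorithm with per-step scaling for numerical stability; it is a generic illustration, not the paper's variational, conditionally dependent formulation.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Standard scaled HMM forward-backward recursion.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = p(z_t = j | z_{t-1} = i)
    B   : (K, M) emission matrix,   B[k, m] = p(x_t = m | z_t = k)
    obs : (T,)   integer observation sequence
    Returns the posterior state marginals gamma, shape (T, K).
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    beta = np.zeros((T, K))
    c = np.zeros(T)                      # per-step scaling factors

    # Forward pass (filtering), normalised at every step.
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]

    # Backward pass, using the same scaling factors.
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]

    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Usage with a toy 2-state, 2-symbol HMM.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward_backward(pi, A, B, [0, 1, 0, 0]))
```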