Time series forecasting with Gaussian Processes needs priors
- URL: http://arxiv.org/abs/2009.08102v2
- Date: Mon, 21 Jun 2021 10:10:03 GMT
- Title: Time series forecasting with Gaussian Processes needs priors
- Authors: Giorgio Corani, Alessio Benavoli, Marco Zaffalon
- Abstract summary: We propose automatic selection of the optimal kernel and reliable estimation of the hyperparameters.
We present results on many time series of different types; our GP model is more accurate than state-of-the-art time series models.
- Score: 1.5877673959068452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatic forecasting is the task of receiving a time series and returning a
forecast for the next time steps without any human intervention. Gaussian
Processes (GPs) are a powerful tool for modeling time series, but so far there
are no competitive approaches for automatic forecasting based on GPs. We
propose practical solutions to two problems: automatic selection of the optimal
kernel and reliable estimation of the hyperparameters. We propose a fixed
composition of kernels, which contains the components needed to model most time
series: a linear trend, periodic patterns, and a flexible kernel for modeling
the non-linear trend. Not all components are necessary to model each time
series; during training the unnecessary components are automatically made
irrelevant via automatic relevance determination (ARD). We moreover assign
priors to the hyperparameters, in order to keep the inference within a
plausible range; we design such priors through an empirical Bayes approach. We
present results on many time series of different types; our GP model is more
accurate than state-of-the-art time series models. Thanks to the priors, a
single restart is enough to estimate the hyperparameters; hence the model is
also fast to train.
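The fixed kernel composition described above (linear trend + periodic pattern + a flexible kernel for the non-linear trend) can be sketched with scikit-learn's GP kernels. This is an illustrative approximation only: the kernel choices and hyperparameter values here are assumptions, and scikit-learn fits hyperparameters by maximizing the marginal likelihood without the paper's empirical-Bayes priors or ARD-driven component pruning.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    DotProduct, ExpSineSquared, RBF, WhiteKernel,
)

# Additive kernel composition (illustrative, not the paper's exact kernels):
kernel = (
    DotProduct(sigma_0=1.0)                               # linear trend
    + ExpSineSquared(length_scale=1.0, periodicity=12.0)  # periodic pattern
    + RBF(length_scale=10.0)                              # flexible non-linear trend
    + WhiteKernel(noise_level=0.1)                        # observation noise
)

# Toy monthly series: linear trend + yearly seasonality + noise.
rng = np.random.default_rng(0)
t = np.arange(60, dtype=float).reshape(-1, 1)
y = 0.05 * t.ravel() + np.sin(2 * np.pi * t.ravel() / 12) \
    + 0.1 * rng.standard_normal(60)

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t, y)

# Forecast the next 12 steps with predictive mean and standard deviation.
t_future = np.arange(60, 72, dtype=float).reshape(-1, 1)
mean, std = gp.predict(t_future, return_std=True)
```

Because the kernels are summed, each component contributes independently to the covariance; in the paper, components that a given series does not need are suppressed during training via ARD rather than removed by hand.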
Related papers
- auto-sktime: Automated Time Series Forecasting [18.640815949661903]
We introduce auto-sktime, a novel framework for automated time series forecasting.
The proposed framework uses the power of automated machine learning (AutoML) techniques to automate the creation of the entire forecasting pipeline.
Experimental results on 64 diverse real-world time series datasets demonstrate the effectiveness and efficiency of the framework.
arXiv Detail & Related papers (2023-12-13T21:34:30Z) - Generalized Mixture Model for Extreme Events Forecasting in Time Series Data [10.542258423966492]
Time Series Forecasting (TSF) is a widely researched topic with broad applications in weather forecasting, traffic control, and stock price prediction.
Extreme values in time series often significantly impact human and natural systems, but predicting them is challenging due to their rare occurrence.
We propose a novel framework to enhance the focus on extreme events. Specifically, we propose a Deep Extreme Mixture Model with Autoencoder (DEMMA) for time series prediction.
arXiv Detail & Related papers (2023-10-11T12:36:42Z) - TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series [57.4208255711412]
Building on copula theory, we propose a simplified objective for the recently-introduced transformer-based attentional copulas (TACTiS)
We show that the resulting model has significantly better training dynamics and achieves state-of-the-art performance across diverse real-world forecasting tasks.
arXiv Detail & Related papers (2023-10-02T16:45:19Z) - Precision-Recall Divergence Optimization for Generative Modeling with GANs and Normalizing Flows [54.050498411883495]
We develop a novel training method for generative models, such as Generative Adversarial Networks and Normalizing Flows.
We show that achieving a specified precision-recall trade-off corresponds to minimizing a unique $f$-divergence from a family we call the PR-divergences.
Our approach improves the performance of existing state-of-the-art models like BigGAN in terms of either precision or recall when tested on datasets such as ImageNet.
arXiv Detail & Related papers (2023-05-30T10:07:17Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields improved estimation of pattern latency compared with the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - Model-based micro-data reinforcement learning: what are the crucial model properties and which model to choose? [0.2836066255205732]
We contribute to micro-data model-based reinforcement learning (MBRL) by rigorously comparing popular generative models.
We find that on an environment that requires multimodal posterior predictives, mixture density nets outperform all other models by a large margin.
We also find that deterministic models are on par; in fact, they consistently (although not significantly) outperform their probabilistic counterparts.
arXiv Detail & Related papers (2021-07-24T11:38:25Z) - Randomized Neural Networks for Forecasting Time Series with Multiple Seasonality [0.0]
This work contributes to the development of neural forecasting models with novel randomization-based learning methods.
A pattern-based representation of time series makes the proposed approach useful for forecasting time series with multiple seasonality.
arXiv Detail & Related papers (2021-07-04T18:39:27Z) - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [68.86835407617778]
Autoformer is a novel decomposition architecture with an Auto-Correlation mechanism.
In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a relative improvement on six benchmarks.
arXiv Detail & Related papers (2021-06-24T13:43:43Z) - ScoreGrad: Multivariate Probabilistic Time Series Forecasting with Continuous Energy-based Generative Models [10.337742174633052]
We propose ScoreGrad, a probabilistic time series forecasting framework based on continuous energy-based generative models.
ScoreGrad is composed of a time series feature extraction module and a conditional differential-equation-based score matching module.
It achieves state-of-the-art results on six real-world datasets.
arXiv Detail & Related papers (2021-06-18T13:22:12Z) - Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Nonparametric Estimation in the Dynamic Bradley-Terry Model [69.70604365861121]
We develop a novel estimator that relies on kernel smoothing to pre-process the pairwise comparisons over time.
We derive time-varying oracle bounds for both the estimation error and the excess risk in the model-agnostic setting.
arXiv Detail & Related papers (2020-02-28T21:52:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.