Koopman Ensembles for Probabilistic Time Series Forecasting
- URL: http://arxiv.org/abs/2403.06757v2
- Date: Wed, 13 Mar 2024 13:57:42 GMT
- Title: Koopman Ensembles for Probabilistic Time Series Forecasting
- Authors: Anthony Frion, Lucas Drumetz, Guillaume Tochon, Mauro Dalla Mura,
Abdeldjalil Aïssa El Bey
- Abstract summary: We show that ensembles of independently trained models are highly overconfident and that using a training criterion that explicitly encourages the members to produce predictions with high inter-model variances greatly improves the uncertainty quantification of the ensembles.
- Score: 6.699751896019971
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the context of an increasing popularity of data-driven models to represent
dynamical systems, many machine learning-based implementations of the Koopman
operator have recently been proposed. However, the vast majority of those works
are limited to deterministic predictions, while the knowledge of uncertainty is
critical in fields like meteorology and climatology. In this work, we
investigate the training of ensembles of models to produce stochastic outputs.
We show through experiments on real remote sensing image time series that
ensembles of independently trained models are highly overconfident and that
using a training criterion that explicitly encourages the members to produce
predictions with high inter-model variances greatly improves the uncertainty
quantification of the ensembles.
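To make this concrete, below is a minimal, hypothetical sketch of a variance-aware ensemble training criterion: the targets are scored under a Gaussian whose mean and variance are the inter-member mean and variance of the ensemble's multi-step predictions, so that jointly trained members are rewarded for spreading out wherever a single deterministic consensus would be overconfident. The Koopman-style member architecture and all names below are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class KoopmanMember(nn.Module):
    """One ensemble member: encoder -> linear latent dynamics -> decoder.
    A generic Koopman-style predictor, for illustration only."""
    def __init__(self, dim_x, dim_z):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_x, 64), nn.ReLU(), nn.Linear(64, dim_z))
        self.K = nn.Linear(dim_z, dim_z, bias=False)   # linear operator in the latent space
        self.decoder = nn.Sequential(nn.Linear(dim_z, 64), nn.ReLU(), nn.Linear(64, dim_x))

    def forward(self, x, horizon):
        z = self.encoder(x)                            # x: (batch, dim_x)
        preds = []
        for _ in range(horizon):
            z = self.K(z)                              # advance the latent state linearly
            preds.append(self.decoder(z))
        return torch.stack(preds, dim=1)               # (batch, horizon, dim_x)

def ensemble_nll(members, x, y, horizon, eps=1e-6):
    """Gaussian NLL of targets y under the ensemble's predictive distribution,
    with mean and variance taken across members (requires at least 2 members).
    A sketch of a variance-encouraging criterion, not the paper's exact loss."""
    preds = torch.stack([m(x, horizon) for m in members], dim=0)  # (M, batch, horizon, dim_x)
    mean = preds.mean(dim=0)
    var = preds.var(dim=0, unbiased=False) + eps
    return 0.5 * (torch.log(var) + (y - mean) ** 2 / var).mean()

# Usage (shapes are illustrative):
# members = [KoopmanMember(dim_x=4, dim_z=16) for _ in range(8)]
# opt = torch.optim.Adam([p for m in members for p in m.parameters()], lr=1e-3)
# loss = ensemble_nll(members, x_batch, y_batch, horizon=10); loss.backward(); opt.step()
```

Trained this way, the members are optimized as a group: the criterion penalizes both inaccurate ensemble means and overconfident (too small) inter-member variances.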
Related papers
- Dynamic Post-Hoc Neural Ensemblers [55.15643209328513]
In this study, we explore employing neural networks as ensemble methods.
Motivated by the risk of learning low-diversity ensembles, we propose regularizing the model by randomly dropping base model predictions.
We demonstrate that this approach lower-bounds the diversity within the ensemble, reducing overfitting and improving generalization (a minimal sketch of the prediction-dropout idea is given after this list).
arXiv Detail & Related papers (2024-10-06T15:25:39Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - MECATS: Mixture-of-Experts for Quantile Forecasts of Aggregated Time
Series [11.826510794042548]
We introduce a mixture of heterogeneous experts framework called MECATS.
It simultaneously forecasts the values of a set of time series that are related through an aggregation hierarchy.
Different types of forecasting models can be employed as individual experts so that the form of each model can be tailored to the nature of the corresponding time series.
arXiv Detail & Related papers (2021-12-22T05:05:30Z) - Ensembles of Randomized NNs for Pattern-based Time Series Forecasting [0.0]
We propose an ensemble forecasting approach based on randomized neural networks.
A pattern-based representation of time series makes the proposed approach suitable for forecasting time series with multiple seasonality.
Case studies conducted on four real-world forecasting problems verified the effectiveness and superior performance of the proposed ensemble forecasting approach.
arXiv Detail & Related papers (2021-07-08T20:13:50Z) - Randomized Neural Networks for Forecasting Time Series with Multiple
Seasonality [0.0]
This work contributes to the development of neural forecasting models with novel randomization-based learning methods.
A pattern-based representation of time series makes the proposed approach useful for forecasting time series with multiple seasonality.
arXiv Detail & Related papers (2021-07-04T18:39:27Z) - Simultaneously Reconciled Quantile Forecasting of Hierarchically Related
Time Series [11.004159006784977]
We propose a flexible nonlinear model that optimizes a quantile regression loss coupled with suitable regularization terms to maintain consistency of forecasts across hierarchies.
The theoretical framework introduced herein can be applied to any forecasting model with an underlying differentiable loss function.
arXiv Detail & Related papers (2021-02-25T00:59:01Z) - Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
arXiv Detail & Related papers (2021-02-23T05:23:35Z) - Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z) - Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate
Time Series Forecasting [4.131842516813833]
We introduce a novel temporal latent auto-encoder method which enables nonlinear factorization of time series.
By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Our model achieves state-of-the-art performance on many popular multivariate datasets, with gains sometimes as high as 50% for several standard metrics.
arXiv Detail & Related papers (2021-01-25T22:29:40Z) - Ambiguity in Sequential Data: Predicting Uncertain Futures with
Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
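As referenced in the Dynamic Post-Hoc Neural Ensemblers entry above, the idea of regularizing a post-hoc ensembler by randomly dropping base-model predictions can be sketched as follows. This is a hypothetical illustration under assumed shapes and a simple gating network; it is not that paper's architecture.

```python
import torch
import torch.nn as nn

class PredictionDropoutEnsembler(nn.Module):
    """Aggregates base-model point forecasts with learned weights and randomly
    drops whole base predictions during training, which discourages the
    aggregator from collapsing onto a few base models (illustrative sketch)."""
    def __init__(self, n_models, p_drop=0.3):
        super().__init__()
        self.p_drop = p_drop
        self.gate = nn.Sequential(nn.Linear(n_models, 32), nn.ReLU(), nn.Linear(32, n_models))

    def forward(self, base_preds):                     # base_preds: (batch, n_models)
        logits = self.gate(base_preds)
        if self.training:
            # drop each base model's prediction independently for each sample
            drop = torch.rand_like(logits) < self.p_drop
            # always keep at least one model so the softmax stays well-defined
            drop[torch.arange(drop.shape[0]), drop.float().argmin(dim=1)] = False
            logits = logits.masked_fill(drop, float("-inf"))
        weights = torch.softmax(logits, dim=1)
        return (weights * base_preds).sum(dim=1)       # aggregated forecast: (batch,)

# Usage (illustrative): ens = PredictionDropoutEnsembler(n_models=5)
# y_hat = ens(torch.randn(8, 5))                      # 8 samples, 5 base forecasts
```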
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.