Forecasting time series with encoder-decoder neural networks
- URL: http://arxiv.org/abs/2009.08848v1
- Date: Fri, 18 Sep 2020 14:07:38 GMT
- Title: Forecasting time series with encoder-decoder neural networks
- Authors: Nathawut Phandoidaen, Stefan Richter
- Abstract summary: We consider high-dimensional stationary processes where a new observation is generated from a compressed version of past observations.
The specific evolution is modeled by an encoder-decoder structure.
We estimate the evolution with an encoder-decoder neural network and give upper bounds for the expected forecast error.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we consider high-dimensional stationary processes where a new
observation is generated from a compressed version of past observations. The
specific evolution is modeled by an encoder-decoder structure. We estimate the
evolution with an encoder-decoder neural network and give upper bounds for the
expected forecast error under specific structural and sparsity assumptions. The
results are shown separately for conditions either on the absolutely regular
mixing coefficients or the functional dependence measure of the observed
process. In a quantitative simulation we discuss the behavior of the network
estimator under different model assumptions. We corroborate our theory with a
real-data example in which we forecast temperature data.
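The evolution described in the abstract (a window of past observations is compressed by an encoder, and the next observation is decoded from the compressed representation) can be sketched as follows. This is a minimal illustration of the shapes involved, not the paper's actual estimator: the linear maps, random weights, and dimensions below are assumptions made purely for the sketch, whereas the paper uses trained neural networks.

```python
import random

random.seed(0)

# Illustrative dimensions: d-dimensional observations, a window of r lags,
# and a compressed (encoded) representation of size m << d * r.
d, r, m = 6, 4, 2

# Toy linear encoder/decoder weights. In the paper these maps are neural
# networks estimated from data; here they are random matrices chosen only
# to show how the encoder-decoder forecast is wired together.
W_enc = [[random.gauss(0, 0.1) for _ in range(d * r)] for _ in range(m)]
W_dec = [[random.gauss(0, 0.1) for _ in range(m)] for _ in range(d)]

def matvec(W, v):
    """Matrix-vector product over plain Python lists."""
    return [sum(w_i * v_i for w_i, v_i in zip(row, v)) for row in W]

def forecast(window):
    """One-step forecast from the last r observations (list of r length-d vectors)."""
    flat = [x for obs in window for x in obs]  # stack the past window
    code = matvec(W_enc, flat)                 # encode: compress to dimension m
    return matvec(W_dec, code)                 # decode: predict the next observation

# Simulate a stationary process and produce a one-step forecast.
X = [[random.gauss(0, 1) for _ in range(d)] for _ in range(50)]
x_next = forecast(X[-r:])
print(len(x_next))  # 6
```

The compression step is what makes the structural and sparsity assumptions in the paper bite: the forecast depends on the past only through the m-dimensional code.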
Related papers
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Postprocessing of Ensemble Weather Forecasts Using Permutation-invariant Neural Networks [0.0]
We propose networks that treat forecast ensembles as a set of unordered member forecasts.
We evaluate the quality of the obtained forecast distributions in terms of calibration and sharpness.
Our results suggest that most of the relevant information is contained in a few ensemble-internal degrees of freedom.
arXiv Detail & Related papers (2023-09-08T17:20:51Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting nonstationary processes or processes with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate this tessellation and approximate the multiple hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z) - Theoretical analysis of deep neural networks for temporally dependent
observations [1.6752182911522522]
We study theoretical properties of deep neural networks on modeling non-linear time series data.
Results are supported via various numerical simulation settings as well as an application to a macroeconomic data set.
arXiv Detail & Related papers (2022-10-20T18:56:37Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Differentiable Generalised Predictive Coding [2.868176771215219]
This paper deals with differentiable dynamical models congruent with neural process theories that cast brain function as the hierarchical refinement of an internal generative model explaining observations.
Our work extends existing implementations of gradient-based predictive coding and allows the integration of deep neural networks for non-linear state parameterization.
arXiv Detail & Related papers (2021-12-02T22:02:56Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z) - Dynamic Time Warping as a New Evaluation for Dst Forecast with Machine
Learning [0.0]
We train a neural network to forecast the disturbance storm time (Dst) index at origin time $t$ with a forecasting horizon of 1 up to 6 hours.
Inspection of the model's results with the correlation coefficient and RMSE indicated a performance comparable to the latest publications.
A new method is proposed to measure whether two time series are shifted in time with respect to each other.
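The shift-aware comparison this entry refers to is, in its classical form, dynamic time warping (DTW), which aligns two series before accumulating pointwise costs. A minimal sketch of the standard DTW recurrence (an assumption for illustration; the paper's exact evaluation method may differ):

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D series.
    Unlike pointwise errors such as RMSE, it tolerates time shifts."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = minimal cumulative cost of aligning a[:i] with b[:j].
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Step from a match, an insertion, or a deletion.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# A forecast that merely lags the truth by one step is penalized heavily
# by pointwise metrics but stays close under DTW.
truth   = [0, 1, 2, 3, 2, 1, 0]
lagged  = [0, 0, 1, 2, 3, 2, 1]
print(dtw_distance(truth, lagged))  # 1.0
```

This illustrates why such a measure can detect whether a model genuinely predicts ahead or merely reproduces a time-shifted copy of the input.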
arXiv Detail & Related papers (2020-06-08T15:14:13Z)
- Anomaly Detection And Classification In Time Series With Kervolutional Neural Networks [1.3535770763481902]
In this paper, we explore the potential of kervolutional neural networks applied to time series data.
We demonstrate that using a mixture of convolutional and kervolutional layers improves the model performance.
We propose a residual-based anomaly detection approach using a temporal auto-encoder.
arXiv Detail & Related papers (2020-05-14T15:45:11Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all generated summaries) and is not responsible for any consequences arising from its use.