RECOWNs: Probabilistic Circuits for Trustworthy Time Series Forecasting
- URL: http://arxiv.org/abs/2106.04148v1
- Date: Tue, 8 Jun 2021 07:32:12 GMT
- Title: RECOWNs: Probabilistic Circuits for Trustworthy Time Series Forecasting
- Authors: Nils Thoma, Zhongjie Yu, Fabrizio Ventola, Kristian Kersting
- Abstract summary: Recurrent Neural Networks (RNNs) are the models of choice for time series forecasting.
Whittle Sum-Product Networks (WSPNs), prominent deep tractable probabilistic circuits (PCs) for time series, can assist an RNN by providing meaningful probabilities as an uncertainty measure.
We propose RECOWN, a novel architecture that employs RNNs and a discriminative variant of WSPNs called Conditional WSPNs (CWSPNs).
- Score: 14.65383490077168
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is a relevant task that is performed in several
real-world scenarios such as product sales analysis and prediction of energy
demand. Given their accuracy, Recurrent Neural Networks (RNNs) are currently
the models of choice for this task. Despite their success in time series
forecasting, less attention has been paid to making RNNs trustworthy. For
example, RNNs cannot naturally provide an uncertainty measure for their
predictions. Such a measure could be extremely useful in practice, e.g., to
detect when a prediction might be completely wrong due to an unusual pattern
in the time series. Whittle Sum-Product Networks (WSPNs), prominent deep
tractable probabilistic circuits (PCs) for time series, can assist an RNN by
providing meaningful probabilities as an uncertainty measure. With this aim, we
propose RECOWN, a novel architecture that employs RNNs and a discriminative
variant of WSPNs called Conditional WSPNs (CWSPNs). We also formulate a
Log-Likelihood Ratio Score as a better estimate of uncertainty, tailored to
time series and Whittle likelihoods. In our experiments, we show that
RECOWNs are accurate and trustworthy time series predictors, able to "know when
they do not know".
Related papers
- Uncertainty in Graph Neural Networks: A Survey [50.63474656037679]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications.
However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions.
This survey aims to provide a comprehensive overview of GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z)
- Early-Exit Neural Networks with Nested Prediction Sets [26.618810100134862]
Early-exit neural networks (EENNs) enable adaptive and efficient inference by providing predictions at multiple stages during the forward pass.
Standard techniques such as conformal prediction and Bayesian credible sets are not suitable for EENNs.
We investigate anytime-valid confidence sequences (AVCSs).
These sequences are inherently nested and thus well-suited for an EENN's sequential predictions.
arXiv Detail & Related papers (2023-11-10T08:38:18Z)
- Uncertainty Quantification over Graph with Conformalized Graph Neural Networks [52.20904874696597]
Graph Neural Networks (GNNs) are powerful machine learning prediction models on graph-structured data.
GNNs lack rigorous uncertainty estimates, limiting their reliable deployment in settings where the cost of errors is significant.
We propose conformalized GNN (CF-GNN), extending conformal prediction (CP) to graph-based models for guaranteed uncertainty estimates.
arXiv Detail & Related papers (2023-05-23T21:38:23Z)
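The CF-GNN entry above extends conformal prediction (CP) to graphs; as a refresher on the underlying mechanism, here is a generic split-conformal sketch for regression. It is our own minimal illustration, not CF-GNN's graph-aware calibration, which is defined in that paper.

```python
import numpy as np

def split_conformal_interval(cal_preds, cal_labels, test_pred, alpha=0.1):
    """Generic split conformal prediction for regression: turn point
    predictions into intervals with ~(1 - alpha) marginal coverage,
    using absolute residuals on a held-out calibration set."""
    scores = np.abs(cal_labels - cal_preds)               # nonconformity
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)  # finite-sample correction
    q = np.quantile(scores, level, method="higher")
    return test_pred - q, test_pred + q

# Example with a dummy "model" that always predicts the calibration mean
cal_labels = np.random.default_rng(1).normal(0.0, 1.0, 200)
cal_preds = np.zeros_like(cal_labels)
print(split_conformal_interval(cal_preds, cal_labels, test_pred=0.0))
```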
- Probabilistic Time Series Forecasting with Implicit Quantile Networks [0.7249731529275341]
We combine an autoregressive recurrent neural network, which models the temporal dynamics, with Implicit Quantile Networks, which learn a large class of distributions over a time-series target.
Our approach is favorable in terms of point-wise prediction accuracy as well as in estimating the underlying temporal distribution.
arXiv Detail & Related papers (2021-07-08T10:37:24Z)
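The Implicit Quantile Networks approach above learns a distribution by training against the quantile (pinball) loss at randomly sampled quantile levels. The sketch of that loss below is our own minimal illustration, not code from the paper.

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss at level tau in (0, 1): under-prediction
    is weighted by tau, over-prediction by (1 - tau)."""
    diff = y_true - y_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# At tau = 0.9, the same absolute error costs more when we under-predict
y_true = np.array([10.0, 10.0])
print(pinball_loss(y_true, np.array([9.0, 11.0]), tau=0.9))
```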
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission and transition models are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- An Infinite-Feature Extension for Bayesian ReLU Nets That Fixes Their Asymptotic Overconfidence [65.24701908364383]
A Bayesian treatment can mitigate overconfidence in ReLU nets around the training data.
But far away from the training data, Bayesian ReLU networks (BNNs) can still underestimate uncertainty and thus be overconfident.
We show that the proposed infinite-feature extension can be applied post-hoc to any pre-trained ReLU BNN at a low cost.
arXiv Detail & Related papers (2020-10-06T13:32:18Z)
- Deep Distributional Time Series Models and the Probabilistic Forecasting of Intraday Electricity Prices [0.0]
We propose two approaches to constructing deep time series probabilistic models.
The first is where the output layer of an echo state network (ESN) has disturbances and a shrinkage prior for additional regularization.
The second approach employs the implicit copula of an ESN with Gaussian disturbances, which is a deep copula process on the feature space.
arXiv Detail & Related papers (2020-10-05T08:02:29Z)
- Frequentist Uncertainty in Recurrent Neural Networks via Blockwise Influence Functions [121.10450359856242]
Recurrent neural networks (RNNs) are instrumental in modelling sequential and time-series data.
Existing approaches for uncertainty quantification in RNNs are based predominantly on Bayesian methods.
We develop a frequentist alternative that: (a) does not interfere with model training or compromise its accuracy, (b) applies to any RNN architecture, and (c) provides theoretical coverage guarantees on the estimated uncertainty intervals.
arXiv Detail & Related papers (2020-06-20T22:45:32Z)
- Spatiotemporal Adaptive Neural Network for Long-term Forecasting of Financial Time Series [0.2793095554369281]
We investigate whether deep neural networks (DNNs) can be used to forecast multiple time series (TS) conjointly.
We make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model.
With ACTM, it is possible to vary the autoregressive order of a TS model over time and model a larger set of probability distributions.
arXiv Detail & Related papers (2020-03-27T00:53:11Z)
- Zero-shot and few-shot time series forecasting with ordinal regression recurrent neural networks [17.844338213026976]
Recurrent neural networks (RNNs) are state-of-the-art in several sequential learning tasks, but they often require considerable amounts of data to generalise well.
We propose a novel RNN-based model that directly addresses this problem by learning a shared feature embedding over the space of many quantised time series.
We show how this enables our RNN framework to accurately and reliably forecast unseen time series, even when there is little to no training data available.
arXiv Detail & Related papers (2020-03-26T21:33:10Z)
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
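The MHP entry above handles ambiguity by training several hypotheses with a (soft) winner-takes-all objective. The following sketch of such a loss is our own minimal illustration in the spirit of MHP, not the paper's implementation.

```python
import numpy as np

def mhp_loss(hypotheses, target, eps=0.05):
    """Soft winner-takes-all loss in the spirit of Multiple Hypothesis
    Prediction: the hypothesis closest to the target receives weight
    (1 - eps); the remaining hypotheses share eps between them."""
    errors = np.sum((hypotheses - target) ** 2, axis=1)  # per-hypothesis SE
    m = len(errors)
    weights = np.full(m, eps / (m - 1))
    weights[np.argmin(errors)] = 1.0 - eps               # winner takes most
    return float(np.dot(weights, errors))

# Three hypotheses for a 2-D target; the closest one dominates the loss
hyps = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
print(mhp_loss(hyps, target=np.array([0.9, 1.1])))
```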
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.