Learning Mixture Structure on Multi-Source Time Series for Probabilistic
Forecasting
- URL: http://arxiv.org/abs/2302.11078v1
- Date: Wed, 22 Feb 2023 00:51:44 GMT
- Title: Learning Mixture Structure on Multi-Source Time Series for Probabilistic
Forecasting
- Authors: Tian Guo
- Abstract summary: We propose a neural mixture structure-based probability model for learning different predictive relations.
We present the prediction and uncertainty quantification methods that apply to different distributions of target variables.
- Score: 4.179947630802189
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many data-driven applications, collecting data from different sources is
increasingly desirable for enhancing performance. In this paper, we are
interested in the problem of probabilistic forecasting with multi-source time
series. We propose a neural mixture structure-based probability model for
learning different predictive relations and their adaptive combinations from
multi-source time series. We present the prediction and uncertainty
quantification methods that apply to different distributions of target
variables. Additionally, given the imbalanced and unstable behaviors observed
during the direct training of the proposed mixture model, we develop a phased
learning method and provide a theoretical analysis. In experimental
evaluations, the mixture model trained by the phased learning exhibits
competitive performance on both point and probabilistic prediction metrics.
Meanwhile, the proposed uncertainty conditioned error suggests the potential of
the mixture model's uncertainty score as a reliability indicator of
predictions.
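The adaptive combination described in the abstract can be sketched as a mixture of per-source predictive distributions. The snippet below is a minimal illustration only: the Gaussian component form, the softmax gating, and all function names are our assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array of gate logits."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def mixture_forecast(means, variances, gate_logits):
    """Combine per-source Gaussian predictions into one mixture forecast.

    means, variances: shape (K,), predictive parameters from K source-specific heads.
    gate_logits: shape (K,), unnormalized adaptive combination weights.
    Returns the mixture mean and variance via the law of total variance.
    """
    w = softmax(gate_logits)
    mix_mean = np.dot(w, means)
    mix_var = np.dot(w, variances + means**2) - mix_mean**2
    return mix_mean, mix_var

def mixture_nll(y, means, variances, gate_logits):
    """Negative log-likelihood of y under the Gaussian mixture (a typical training loss)."""
    w = softmax(gate_logits)
    comp = w * np.exp(-0.5 * (y - means)**2 / variances) / np.sqrt(2 * np.pi * variances)
    return -np.log(comp.sum())
```

With two sources predicting means 1.0 and 3.0 (variance 0.5 each) and equal gate logits, the mixture mean is 2.0 and the mixture variance is 1.5, i.e. the between-source disagreement inflates the uncertainty, which is what makes the mixture's uncertainty score usable as a reliability indicator.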
Related papers
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z) - Invariant Probabilistic Prediction [45.90606906307022]
We show that arbitrary distribution shifts do not, in general, admit invariant and robust probabilistic predictions.
We propose a method to yield invariant probabilistic predictions, called IPP, and study the consistency of the underlying parameters.
arXiv Detail & Related papers (2023-09-18T18:50:24Z) - Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z) - Better Batch for Deep Probabilistic Time Series Forecasting [15.31488551912888]
We propose an innovative training method that incorporates error autocorrelation to enhance probabilistic forecasting accuracy.
Our method constructs a mini-batch as a collection of $D$ consecutive time series segments for model training.
It explicitly learns a time-varying covariance matrix over each mini-batch, encoding error correlation among adjacent time steps.
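The batch construction this summary describes can be sketched as follows; the one-step shift between segments and the function name are our assumptions for illustration.

```python
import numpy as np

def consecutive_segment_batch(series, window, D, start):
    """Build a mini-batch of D consecutive time-series segments.

    Adjacent segments are shifted by one step, so the batch spans errors at
    neighboring time steps and a covariance over them can be estimated.
    """
    return np.stack([series[start + d : start + d + window] for d in range(D)])

series = np.arange(20, dtype=float)
batch = consecutive_segment_batch(series, window=5, D=4, start=0)
# batch has shape (4, 5); row d starts at time step d
```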
arXiv Detail & Related papers (2023-05-26T15:36:59Z) - Scalable Dynamic Mixture Model with Full Covariance for Probabilistic
Traffic Forecasting [16.04029885574568]
We propose a dynamic mixture of zero-mean Gaussian distributions for the time-varying error process.
The proposed method can be seamlessly integrated into existing deep-learning frameworks with only a few additional parameters to be learned.
We evaluate the proposed method on a traffic speed forecasting task and find that it not only improves model performance but also provides interpretable temporal correlation structures.
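The key identity behind this entry is that a mixture of zero-mean Gaussians has covariance equal to the weight-averaged component covariances. A minimal numpy sketch (function name ours; weights are time-varying in the paper but fixed here):

```python
import numpy as np

def mixture_error_covariance(weights, covariances):
    """Covariance of a zero-mean Gaussian mixture: Cov = sum_k w_k * Sigma_k.

    weights: (K,) mixture weights, nonnegative and summing to 1.
    covariances: (K, N, N) full component covariance matrices.
    """
    return np.einsum('k,kij->ij', weights, covariances)
```

For example, an equal-weight mixture of two components with covariances I and 3I yields an overall error covariance of 2I.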
arXiv Detail & Related papers (2022-12-10T22:50:00Z) - How to Combine Variational Bayesian Networks in Federated Learning [0.0]
Federated learning enables multiple data centers to train a central model collaboratively without exposing any confidential data.
While deterministic models can achieve high prediction accuracy, their lack of calibration and inability to quantify uncertainty are problematic for safety-critical applications.
We study the effects of various aggregation schemes for variational Bayesian neural networks.
arXiv Detail & Related papers (2022-06-22T07:53:12Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Balance-Subsampled Stable Prediction [55.13512328954456]
We propose a novel balance-subsampled stable prediction (BSSP) algorithm based on the theory of fractional factorial design.
A design-theoretic analysis shows that the proposed method can reduce the confounding effects among predictors induced by the distribution shift.
Numerical experiments on both synthetic and real-world data sets demonstrate that our BSSP algorithm significantly outperforms the baseline methods for stable prediction across unknown test data.
arXiv Detail & Related papers (2020-06-08T07:01:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.