Masked Multi-Step Multivariate Time Series Forecasting with Future
Information
- URL: http://arxiv.org/abs/2209.14413v1
- Date: Wed, 28 Sep 2022 20:49:11 GMT
- Title: Masked Multi-Step Multivariate Time Series Forecasting with Future
Information
- Authors: Yiwei Fu, Honggang Wang, Nurali Virani
- Abstract summary: In many real-world forecasting scenarios, some future information is known, e.g., the weather information when making a short-to-mid-term electricity demand forecast.
To overcome the limitations of existing approaches, we propose MMMF, a framework to train any neural network model capable of generating a sequence of outputs.
Experiments are performed on two real-world datasets: (1) mid-term electricity demand forecasting and (2) two-month-ahead flight departure forecasting.
- Score: 7.544120398993689
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we introduce Masked Multi-Step Multivariate Forecasting
(MMMF), a novel and general self-supervised learning framework for time series
forecasting with known future information. In many real-world forecasting
scenarios, some future information is known, e.g., the weather information when
making a short-to-mid-term electricity demand forecast, or the oil price
forecasts when making an airplane departure forecast. Existing machine learning
forecasting frameworks can be categorized into (1) sample-based approaches
where each forecast is made independently, and (2) time series regression
approaches where the future information is not fully incorporated. To overcome
the limitations of existing approaches, we propose MMMF, a framework for training
any neural network model capable of generating a sequence of outputs; it combines
the temporal information from the past with the known information about the future
to make better predictions. Experiments are performed on two real-world datasets
for (1) mid-term electricity demand forecasting and (2) two-month-ahead flight
departure forecasting. They show that the proposed MMMF
framework outperforms not only sample-based methods but also existing time
series forecasting models with the exact same base models. Furthermore, once a
neural network model is trained with MMMF, its inference speed is similar to
that of the same model trained with traditional regression formulations, thus
making MMMF a better alternative to existing regression-trained time series
forecasting models when some future information is available.
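The abstract describes MMMF only at a high level. Below is a minimal, hypothetical sketch of what masked multi-step training with known future inputs could look like in PyTorch; the names (MaskedForecaster, mmmf_style_step), the GRU base model, and the zero-masking of future targets are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of masked multi-step training with known future covariates.
# Names and architecture are illustrative, not taken from the MMMF paper.
import torch
import torch.nn as nn

class MaskedForecaster(nn.Module):
    """Placeholder base model: any sequence-to-sequence network would do."""
    def __init__(self, n_targets, n_known, hidden=64):
        super().__init__()
        # Per-step input: target values (zeroed in the masked future)
        # + known covariates + a binary "is this step masked" flag.
        self.rnn = nn.GRU(n_targets + n_known + 1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_targets)

    def forward(self, targets_masked, known, mask):
        x = torch.cat([targets_masked, known, mask.unsqueeze(-1)], dim=-1)
        h, _ = self.rnn(x)
        return self.head(h)  # predictions for every step, past and future

def mmmf_style_step(model, optimizer, targets, known, horizon):
    """One training step: mask the last `horizon` target steps, keep the known
    future covariates visible, and fit the model to the masked values."""
    B, T, _ = targets.shape
    mask = torch.zeros(B, T)
    mask[:, T - horizon:] = 1.0                      # 1 = future step to predict
    targets_masked = targets * (1.0 - mask).unsqueeze(-1)
    preds = model(targets_masked, known, mask)
    loss = nn.functional.mse_loss(preds[:, T - horizon:], targets[:, T - horizon:])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: 8 series, 48 past + 24 future steps, 2 targets, 3 known covariates.
model = MaskedForecaster(n_targets=2, n_known=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
targets = torch.randn(8, 72, 2)   # full window, including the 24-step future
known = torch.randn(8, 72, 3)     # e.g., weather forecasts, calendar features
print(mmmf_style_step(model, opt, targets, known, horizon=24))
```

Because only the future targets are masked while the known future covariates stay visible, a model trained this way can fill in the whole masked horizon in a single forward pass at inference time, which is in line with the abstract's claim about inference speed matching regression-trained models.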
Related papers
- Learning Augmentation Policies from A Model Zoo for Time Series Forecasting [58.66211334969299]
We introduce AutoTSAug, a learnable data augmentation method based on reinforcement learning.
By augmenting the marginal samples with a learnable policy, AutoTSAug substantially improves forecasting performance.
arXiv Detail & Related papers (2024-09-10T07:34:19Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z) - Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empowers a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z) - Handling Concept Drift in Global Time Series Forecasting [10.732102570751392]
We propose two new concept drift handling methods, namely Error Contribution Weighting (ECW) and Gradient Descent Weighting (GDW).
These methods use two forecasting models, one trained on the most recent series and one trained on all series; the weighted average of the two models' forecasts is taken as the final forecast.
arXiv Detail & Related papers (2023-04-04T03:46:25Z) - Masked Multi-Step Probabilistic Forecasting for Short-to-Mid-Term
Electricity Demand [7.544120398993689]
Masked Multi-Step Multivariate Probabilistic Forecasting (MMMPF) is a novel and general framework to train any neural network model.
It combines both the temporal information from the past and the known information about the future to make probabilistic predictions.
MMMPF can also generate desired quantiles to capture uncertainty and enable probabilistic planning for the grid of the future.
arXiv Detail & Related papers (2023-02-14T04:09:03Z) - Meta-Forecasting by combining Global DeepRepresentations with Local
Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR)
It adapts to each time series by learning in closed-form the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
arXiv Detail & Related papers (2021-11-05T11:45:02Z) - Back2Future: Leveraging Backfill Dynamics for Improving Real-time
Predictions in Future [73.03458424369657]
In real-time forecasting in public health, data collection is a non-trivial and demanding task.
The 'backfill' phenomenon and its effect on model performance have barely been studied in the prior literature.
We formulate a novel problem and neural framework Back2Future that aims to refine a given model's predictions in real-time.
arXiv Detail & Related papers (2021-06-08T14:48:20Z) - Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z) - Improving the Accuracy of Global Forecasting Models using Time Series
Data Augmentation [7.38079566297881]
Forecasting models that are trained across sets of many time series, known as Global Forecasting Models (GFM), have shown promising results in forecasting competitions and real-world applications.
We propose a novel, data augmentation based forecasting framework that is capable of improving the baseline accuracy of GFM models in less data-abundant settings.
arXiv Detail & Related papers (2020-08-06T13:52:20Z) - A framework for probabilistic weather forecast post-processing across
models and lead times using machine learning [3.1542695050861544]
We show how to bridge the gap between sets of separate forecasts from NWP models and the 'ideal' forecast for decision support.
We use Quantile Regression Forests to learn the error profile of each numerical model, and use these to apply empirically-derived probability distributions to forecasts.
Second, we combine these probabilistic forecasts using quantile averaging. Third, we interpolate between the aggregate quantiles in order to generate a full predictive distribution (a minimal sketch of these last two steps follows this entry).
arXiv Detail & Related papers (2020-05-06T16:46:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.