Time-series Imputation and Prediction with Bi-Directional Generative
Adversarial Networks
- URL: http://arxiv.org/abs/2009.08900v1
- Date: Fri, 18 Sep 2020 15:47:51 GMT
- Title: Time-series Imputation and Prediction with Bi-Directional Generative
Adversarial Networks
- Authors: Mehak Gupta, Rahmatollah Beheshti
- Abstract summary: We present a model for the combined task of imputing and predicting values for irregularly observed and varying length time-series data with missing entries.
Our model learns how to impute missing elements in-between (imputation) or outside of the input time steps (prediction), hence working as an effective any-time prediction tool for time-series data.
- Score: 0.3162999570707049
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multivariate time-series data are used in many classification and regression
predictive tasks, and recurrent models have been widely used for such tasks.
Most common recurrent models assume that time-series data elements are of equal
length and the ordered observations are recorded at regular intervals. However,
real-world time-series data often have neither equal lengths nor the same number of
observations. They also contain missing entries, which hinder the performance of
predictive tasks. In this paper, we approach these issues by presenting a model
for the combined task of imputing and predicting values for irregularly
observed, varying-length time-series data with missing entries. Our proposed
model (Bi-GAN) uses a bidirectional recurrent network in a generative
adversarial setting. The generator is a bidirectional recurrent network that
receives actual incomplete data and imputes the missing values. The
discriminator attempts to discriminate between the actual and the imputed
values in the output of the generator. Our model learns how to impute missing
elements in-between (imputation) or outside of the input time steps
(prediction), hence working as an effective any-time prediction tool for
time-series data. Our method has three advantages over the state-of-the-art
methods in the field: (a) a single model can be used for both the imputation and
prediction tasks; (b) it can perform the prediction task for time-series of
varying length with missing data; (c) it does not require the observation and
prediction time windows to be known during training, which allows a flexible
prediction-window length for both long-term and short-term predictions. We evaluate
our model on two public datasets and on another large real-world electronic
health records dataset to impute and predict body mass index (BMI) values in
children and show its superior performance in both settings.
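The mechanic the abstract describes, observed entries pass through unchanged while missing slots are filled by a bidirectional generator, can be sketched as follows. This is a toy illustration, not the paper's method: a non-learned bidirectional fill stands in for the bidirectional recurrent generator, the adversarial training of the discriminator is omitted, and all function names are illustrative.

```python
import numpy as np

def bidirectional_fill(x, mask):
    """Toy stand-in for the Bi-GAN generator: fill each missing entry with
    the average of the last observed value from a forward sweep and the
    next observed value from a backward sweep (no learned RNN here)."""
    T = len(x)
    fwd = np.empty(T)
    bwd = np.empty(T)
    last = 0.0
    for t in range(T):                 # forward sweep
        if mask[t]:
            last = x[t]
        fwd[t] = last
    last = 0.0
    for t in reversed(range(T)):       # backward sweep
        if mask[t]:
            last = x[t]
        bwd[t] = last
    return 0.5 * (fwd + bwd)

def impute(x, mask):
    """Keep observed entries, use the generator only where data is missing:
    x_hat = m * x + (1 - m) * G(x), with m the observation mask."""
    g = bidirectional_fill(x, mask)
    return np.where(mask, x, g)

x = np.array([1.0, 0.0, 3.0, 0.0, 5.0])       # 0.0 marks a missing slot
mask = np.array([True, False, True, False, True])
x_hat = impute(x, mask)
print(x_hat)   # -> [1. 2. 3. 4. 5.]
```

In the actual model, a discriminator would receive `x_hat` together with the mask and be trained to tell imputed entries from observed ones, pushing the generator's fills toward the data distribution; predicting values outside the observed time steps works the same way, with the missing slots placed at future positions.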
Related papers
- DAM: Towards A Foundation Model for Time Series Forecasting [0.8231118867997028]
We propose a neural model that takes randomly sampled histories and outputs an adjustable basis composition as a continuous function of time.
It involves three key components: (1) a flexible approach for using randomly sampled histories from a long-tail distribution; (2) a transformer backbone trained on these actively sampled histories; and (3) basis coefficients of a continuous function of time, produced as the backbone's representational output.
arXiv Detail & Related papers (2024-07-25T08:48:07Z) - Probabilistic Imputation for Time-series Classification with Missing
Data [17.956329906475084]
We propose a novel framework for classification with time series data with missing values.
Our deep generative model part is trained to impute the missing values in multiple plausible ways.
The classifier part takes the time series data along with the imputed missing values and classifies signals.
arXiv Detail & Related papers (2023-08-13T10:04:13Z) - An End-to-End Time Series Model for Simultaneous Imputation and Forecast [14.756607742477252]
We develop an end-to-end time series model that aims to learn the inference relation and make multiple-step-ahead forecasts.
Our framework trains jointly two neural networks, one to learn the feature-wise correlations and the other for the modeling of temporal behaviors.
arXiv Detail & Related papers (2023-06-01T15:08:22Z) - STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z) - A Generative Language Model for Few-shot Aspect-Based Sentiment Analysis [90.24921443175514]
We focus on aspect-based sentiment analysis, which involves extracting aspect terms and categories and predicting their corresponding polarities.
We propose to reformulate the extraction and prediction tasks into the sequence generation task, using a generative language model with unidirectional attention.
Our approach outperforms the previous state-of-the-art (based on BERT) in average performance by a large margin in both few-shot and full-shot settings.
arXiv Detail & Related papers (2022-04-11T18:31:53Z) - Conformal prediction for the design problem [72.14982816083297]
In many real-world deployments of machine learning, we use a prediction algorithm to choose what data to test next.
In such settings, there is a distinct type of distribution shift between the training and test data.
We introduce a method to quantify predictive uncertainty in such settings.
arXiv Detail & Related papers (2022-02-08T02:59:12Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Time-Series Imputation with Wasserstein Interpolation for Optimal
Look-Ahead-Bias and Variance Tradeoff [66.59869239999459]
In finance, imputation of missing returns may be applied prior to training a portfolio optimization model.
There is an inherent trade-off between the look-ahead-bias of using the full data set for imputation and the larger variance in the imputation from using only the training data.
We propose a Bayesian posterior consensus distribution which optimally controls the variance and look-ahead-bias trade-off in the imputation.
arXiv Detail & Related papers (2021-02-25T09:05:35Z) - Few-shot Learning for Time-series Forecasting [40.58524521473793]
We propose a few-shot learning method that forecasts a future value of a time-series in a target task given a few time-series in the target task.
Our model is trained using time-series data in multiple training tasks that are different from target tasks.
arXiv Detail & Related papers (2020-09-30T01:32:22Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationship by constructing set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z) - Ambiguity in Sequential Data: Predicting Uncertain Futures with
Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.