SpectraNet: Multivariate Forecasting and Imputation under Distribution
Shifts and Missing Data
- URL: http://arxiv.org/abs/2210.12515v2
- Date: Tue, 25 Oct 2022 17:25:56 GMT
- Title: SpectraNet: Multivariate Forecasting and Imputation under Distribution
Shifts and Missing Data
- Authors: Cristian Challu, Peihong Jiang, Ying Nian Wu, Laurent Callot
- Abstract summary: SpectraNet is a novel time-series forecasting model that dynamically infers a latent spectral decomposition to capture temporal dynamics and correlations in the recently observed history.
A Convolutional Neural Network maps the learned representation by sequentially mixing its components and refining the output.
Our proposed approach can simultaneously produce forecasts and interpolate past observations and can, therefore, greatly simplify production systems.
- Score: 40.21502451136054
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we tackle two widespread challenges in real applications for
time-series forecasting that have been largely understudied: distribution
shifts and missing data. We propose SpectraNet, a novel multivariate
time-series forecasting model that dynamically infers a latent space spectral
decomposition to capture current temporal dynamics and correlations in the
recently observed history. A Convolutional Neural Network maps the learned
representation by sequentially mixing its components and refining the output.
Our proposed approach can simultaneously produce forecasts and interpolate past
observations and can, therefore, greatly simplify production systems by
unifying imputation and forecasting tasks into a single model. SpectraNet
achieves SoTA performance simultaneously on both tasks on five benchmark
datasets, compared to forecasting and imputation models, with up to 92% fewer
parameters and comparable training times. On settings with up to 80% missing
data, SpectraNet has average performance improvements of almost 50% over the
second-best alternative. Our code is available at
https://github.com/cchallu/spectranet.
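To make the description above concrete, below is a minimal, illustrative PyTorch sketch of the idea, not the authors' implementation (see the repository above for that): all module names, layer sizes, and the test-time optimization loop are assumptions. A latent vector of coefficients over a fixed sinusoidal basis is fit by gradient descent to the observed, possibly-missing history window, and a small convolutional decoder mixes and refines those components into a multivariate window spanning both the history (imputation) and the horizon (forecast).
```python
# Hypothetical sketch of a latent spectral decomposition + convolutional decoder.
import torch
import torch.nn as nn


class SpectralDecoder(nn.Module):
    """Maps latent spectral coefficients to a multivariate window (illustrative layout)."""

    def __init__(self, n_series: int, window: int, n_freqs: int = 16, hidden: int = 32):
        super().__init__()
        t = torch.linspace(0.0, 1.0, window)
        k = torch.arange(1, n_freqs + 1).float()
        # Fixed sinusoidal basis of shape (2 * n_freqs, window).
        basis = torch.cat(
            [torch.sin(2 * torch.pi * k[:, None] * t[None, :]),
             torch.cos(2 * torch.pi * k[:, None] * t[None, :])], dim=0)
        self.register_buffer("basis", basis)
        # Convolutional stack that mixes the spectral components and refines the output.
        self.refine = nn.Sequential(
            nn.Conv1d(2 * n_freqs, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden, n_series, kernel_size=3, padding=1),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, 2 * n_freqs) -> (batch, n_series, window)
        components = z.unsqueeze(-1) * self.basis.unsqueeze(0)  # scale each basis component
        return self.refine(components)


def infer_window(decoder: SpectralDecoder, y_hist, mask, steps: int = 200, lr: float = 0.05):
    """Fit the latent to the observed history, then read off imputation and forecast.

    y_hist: (batch, n_series, hist_len); values where mask == 0 are ignored.
    mask:   (batch, n_series, hist_len); 1 where observed, 0 where missing.
    """
    batch, _, hist_len = y_hist.shape
    z = torch.zeros(batch, decoder.basis.shape[0], requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        window = decoder(z)                       # (batch, n_series, hist_len + horizon)
        recon = window[..., :hist_len]
        # Masked reconstruction loss: only observed entries constrain the latent.
        loss = ((recon - y_hist) ** 2 * mask).sum() / mask.sum().clamp(min=1.0)
        loss.backward()
        opt.step()
    window = decoder(z).detach()
    return window[..., :hist_len], window[..., hist_len:]  # imputed history, forecast


if __name__ == "__main__":
    hist_len, horizon, n_series = 96, 24, 7
    decoder = SpectralDecoder(n_series=n_series, window=hist_len + horizon)
    y = torch.randn(4, n_series, hist_len)
    mask = (torch.rand(4, n_series, hist_len) > 0.8).float()  # ~80% missing observations
    imputed, forecast = infer_window(decoder, y, mask)
    print(imputed.shape, forecast.shape)  # torch.Size([4, 7, 96]) torch.Size([4, 7, 24])
```
In this toy demo the decoder is randomly initialized purely to show the data flow; in a real system it would be trained offline on the same masked objective, so a single decoded window serves both the imputation and forecasting tasks.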
Related papers
- MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention-mechanism-based predictor and a conditional discriminator to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Perceiver-based CDF Modeling for Time Series Forecasting [25.26713741799865]
We propose a new architecture, called perceiver-CDF, for modeling cumulative distribution functions (CDF) of time series data.
Our approach combines the perceiver architecture with a copula-based attention mechanism tailored for multimodal time series prediction.
Experiments on the unimodal and multimodal benchmarks consistently demonstrate a 20% improvement over state-of-the-art methods.
arXiv Detail & Related papers (2023-10-03T01:13:17Z) - Generative Time Series Forecasting with Diffusion, Denoise, and
Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series are recorded over only a short period, which creates a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Deep Generative model with Hierarchical Latent Factors for Time Series
Anomaly Detection [40.21502451136054]
This work presents DGHL, a new family of generative models for time series anomaly detection.
A top-down Convolutional Network maps a novel hierarchical latent space to time series windows, exploiting temporal dynamics to encode information efficiently.
Our method outperformed current state-of-the-art models on four popular benchmark datasets.
arXiv Detail & Related papers (2022-02-15T17:19:44Z) - Neural forecasting at scale [8.245069318446415]
We study the problem of efficiently scaling ensemble-based deep neural networks for time series (TS) forecasting on a large set of time series.
Our model addresses the practical limitations of related models, reducing the training time by half and memory requirement by a factor of 5.
arXiv Detail & Related papers (2021-09-20T17:22:40Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing the embedding of the time series in the spectral domain as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, one global and one local to the time series, integrate this information into the forecast and perform spectral filtering to remove noise from the time series.
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Towards physically consistent data-driven weather forecasting:
Integrating data assimilation with equivariance-preserving deep spatial
transformers [2.7998963147546148]
We propose three components to integrate with commonly used data-driven weather prediction models.
These components are 1) a deep spatial transformer added to the latent space of U-NETs to preserve equivariance, 2) a data-assimilation algorithm that ingests noisy observations and improves the initial conditions for the next forecasts, and 3) a multi-time-step algorithm that improves the accuracy of forecasts at short intervals.
arXiv Detail & Related papers (2021-03-16T23:15:00Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z) - ForecastNet: A Time-Variant Deep Feed-Forward Neural Network
Architecture for Multi-Step-Ahead Time-Series Forecasting [6.043572971237165]
We propose ForecastNet, which uses a deep feed-forward architecture to provide a time-variant model.
ForecastNet is demonstrated to outperform statistical and deep learning benchmark models on several datasets.
arXiv Detail & Related papers (2020-02-11T01:03:33Z)