S4M: S4 for multivariate time series forecasting with Missing values
- URL: http://arxiv.org/abs/2503.00900v1
- Date: Sun, 02 Mar 2025 13:59:59 GMT
- Title: S4M: S4 for multivariate time series forecasting with Missing values
- Authors: Jing Peng, Meiqi Yang, Qiong Zhang, Xiaoxiao Li
- Abstract summary: Time series data play a pivotal role in a wide range of real-world applications. Traditional two-step approaches, which first impute missing values and then perform forecasting, are prone to error accumulation. We introduce S4M, an end-to-end time series forecasting framework that seamlessly integrates missing data handling into the Structured State Space Sequence model architecture.
- Score: 30.547886613423994
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate time series data play a pivotal role in a wide range of real-world applications. However, the presence of block missing data introduces significant challenges, often compromising the performance of predictive models. Traditional two-step approaches, which first impute missing values and then perform forecasting, are prone to error accumulation, particularly in complex multivariate settings characterized by high missing ratios and intricate dependency structures. In this work, we introduce S4M, an end-to-end time series forecasting framework that seamlessly integrates missing data handling into the Structured State Space Sequence (S4) model architecture. Unlike conventional methods that treat imputation as a separate preprocessing step, S4M leverages the latent space of S4 models to directly recognize and represent missing data patterns, thereby more effectively capturing the underlying temporal and multivariate dependencies. Our framework comprises two key components: the Adaptive Temporal Prototype Mapper (ATPM) and the Missing-Aware Dual Stream S4 (MDS-S4). The ATPM employs a prototype bank to derive robust and informative representations from historical data patterns, while the MDS-S4 processes these representations alongside missingness masks as dual input streams to enable accurate forecasting. Through extensive empirical evaluations on diverse real-world datasets, we demonstrate that S4M consistently achieves state-of-the-art performance. These results underscore the efficacy of our integrated approach in handling missing data, showcasing its robustness and superiority over traditional imputation-based methods. Our findings highlight the potential of S4M to advance reliable time series forecasting in practical applications, offering a promising direction for future research and deployment. Code is available at https://github.com/WINTERWEEL/S4M.git.
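To make the architecture described in the abstract concrete, the sketch below mocks up the two components in plain PyTorch. The class names, tensor shapes, and the GRU that stands in for the actual S4 layers are illustrative assumptions rather than the released implementation; see the repository linked above for the real code.

```python
# Minimal, illustrative sketch of the two S4M components described in the abstract
# (ATPM + MDS-S4).  Names and shapes are assumptions for clarity; the real S4 kernel
# is replaced here by a plain GRU to keep the example short.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveTemporalPrototypeMapper(nn.Module):
    """Maps (possibly incomplete) history windows onto a learnable prototype bank."""

    def __init__(self, d_model: int, n_prototypes: int = 32):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(n_prototypes, d_model))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, time, d_model) hidden features of the observed history
        sim = torch.einsum("btd,pd->btp", h, self.prototypes)          # similarity to each prototype
        weights = F.softmax(sim, dim=-1)                               # soft assignment
        return torch.einsum("btp,pd->btd", weights, self.prototypes)   # prototype-based representation


class MissingAwareDualStream(nn.Module):
    """Processes the prototype representation and the missingness mask as two streams."""

    def __init__(self, n_series: int, d_model: int, horizon: int):
        super().__init__()
        self.value_proj = nn.Linear(n_series, d_model)
        self.mask_proj = nn.Linear(n_series, d_model)
        self.atpm = AdaptiveTemporalPrototypeMapper(d_model)
        # Placeholder sequence models; S4M uses structured state space layers here.
        self.value_seq = nn.GRU(d_model, d_model, batch_first=True)
        self.mask_seq = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(2 * d_model, n_series * horizon)
        self.horizon, self.n_series = horizon, n_series

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x, mask: (batch, time, n_series); mask is 1 where a value is observed
        h_val = self.atpm(self.value_proj(x * mask))   # value stream, missing entries zeroed
        h_msk = self.mask_proj(mask.float())           # mask stream
        h_val, _ = self.value_seq(h_val)
        h_msk, _ = self.mask_seq(h_msk)
        h = torch.cat([h_val[:, -1], h_msk[:, -1]], dim=-1)
        return self.head(h).view(-1, self.horizon, self.n_series)


model = MissingAwareDualStream(n_series=7, d_model=64, horizon=24)
x = torch.randn(8, 96, 7)
mask = (torch.rand(8, 96, 7) > 0.3).float()            # roughly 30% missing
y_hat = model(x, mask)                                 # (8, 24, 7) forecast
```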
Related papers
- PFformer: A Position-Free Transformer Variant for Extreme-Adaptive Multivariate Time Series Forecasting [9.511600544581425]
PFformer is a position-free Transformer-based model designed for single-target MTS forecasting. PFformer integrates two novel embedding strategies: Enhanced Feature-based Embedding (EFE) and Auto-Encoder-based Embedding (AEE).
arXiv Detail & Related papers (2025-02-27T22:21:27Z) - DeformTime: Capturing Variable Dependencies with Deformable Attention for Time Series Forecasting [0.34530027457862006]
We present DeformTime, a neural network architecture that attempts to capture correlated temporal patterns from the input space.
We conduct extensive experiments on 6 MTS data sets, using previously established benchmarks as well as challenging infectious disease modelling tasks.
Results demonstrate that DeformTime improves accuracy against previous competitive methods across the vast majority of MTS forecasting tasks.
arXiv Detail & Related papers (2024-06-11T16:45:48Z) - UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
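A minimal sketch of the flattened-patch-token attention described above, with arbitrary patching hyperparameters assumed for the example; it is not the UniTST implementation, only an illustration of how one attention pass can mix information within and across series once all (series, patch) pairs are flattened into a single token axis.

```python
# Illustrative sketch of attention over flattened patch tokens: every (series, patch)
# pair becomes one token, so a single attention layer can capture both intra- and
# inter-series dependencies.  All sizes below are arbitrary assumptions.
import torch
import torch.nn as nn

B, V, T, P, D = 4, 7, 96, 16, 64                       # batch, series, length, patch length, model dim
x = torch.randn(B, V, T)

patches = x.unfold(dimension=2, size=P, step=P)        # (B, V, T//P, P)
tokens = nn.Linear(P, D)(patches)                      # embed each patch -> (B, V, T//P, D)
tokens = tokens.reshape(B, V * (T // P), D)            # flatten series and patch axes into one token axis

attn = nn.MultiheadAttention(embed_dim=D, num_heads=4, batch_first=True)
mixed, _ = attn(tokens, tokens, tokens)                # one pass mixes across series and across time
print(mixed.shape)                                     # torch.Size([4, 42, 64])
```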
arXiv Detail & Related papers (2024-06-07T14:39:28Z) - Sports-Traj: A Unified Trajectory Generation Model for Multi-Agent Movement in Sports [53.637837706712794]
We propose a Unified Trajectory Generation model, UniTraj, that processes arbitrary trajectories as masked inputs. Specifically, we introduce a Ghost Spatial Masking (GSM) module, embedded within a Transformer encoder, for spatial feature extraction. We benchmark three practical sports datasets, Basketball-U, Football-U, and Soccer-U, for evaluation.
arXiv Detail & Related papers (2024-05-27T22:15:23Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Deep Latent State Space Models for Time-Series Generation [68.45746489575032]
We propose LS4, a generative model for sequences with latent variables evolving according to a state space ODE.
Inspired by recent deep state space models (S4), we achieve speedups by leveraging a convolutional representation of LS4.
We show that LS4 significantly outperforms previous continuous-time generative models in terms of marginal distribution, classification, and prediction scores on real-world datasets.
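The convolutional representation mentioned above is the standard equivalence behind S4-style layers: unrolling a discrete linear state space yields a convolution kernel K_j = C A^j B. The toy sketch below checks that equivalence with dense random matrices; real S4/LS4 layers use structured, learned parameters and latent variables, which are omitted here.

```python
# Toy illustration of the convolutional view of a discrete linear state space model,
# the mechanism S4/LS4 exploit for fast sequence processing.  The dense random
# matrices are stand-ins, not the structured parameterization used by those models.
import numpy as np

N, L = 4, 10                                   # state size, sequence length
A = 0.9 * np.eye(N) + 0.01 * np.random.randn(N, N)
B = np.random.randn(N, 1)
C = np.random.randn(1, N)
u = np.random.randn(L)

# Recurrent view: x_k = A x_{k-1} + B u_k,  y_k = C x_k
x, y_rec = np.zeros((N, 1)), []
for k in range(L):
    x = A @ x + B * u[k]
    y_rec.append((C @ x).item())

# Convolutional view: y = K * u with kernel K_j = C A^j B
K = np.array([(C @ np.linalg.matrix_power(A, j) @ B).item() for j in range(L)])
y_conv = [np.dot(K[: k + 1][::-1], u[: k + 1]) for k in range(L)]

print(np.allclose(y_rec, y_conv))              # True: both views compute the same output
```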
arXiv Detail & Related papers (2022-12-24T15:17:42Z) - Liquid Structural State-Space Models [106.74783377913433]
Liquid-S4 achieves an average performance of 87.32% on the Long-Range Arena benchmark.
On the full raw Speech Command recognition dataset, Liquid-S4 achieves 96.78% accuracy with a 30% reduction in parameter counts compared to S4.
arXiv Detail & Related papers (2022-09-26T18:37:13Z) - Diffusion-based Time Series Imputation and Forecasting with Structured State Space Models [2.299617836036273]
We put forward SSSD, an imputation model that relies on two emerging technologies: conditional diffusion models and structured state space models.
We demonstrate that SSSD matches or even exceeds state-of-the-art probabilistic imputation and forecasting performance on a broad range of data sets and different missingness scenarios.
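As a rough illustration of the recipe SSSD builds on, the snippet below runs one training step of a masked conditional diffusion model for imputation. The tiny MLP denoiser, noise schedule, and conditioning scheme are placeholder assumptions; SSSD's structured state space backbone is left out entirely.

```python
# Generic sketch of one training step of conditional diffusion for imputation:
# noise the series, condition the denoiser on the observed values and the mask,
# and score the predicted noise only on the missing entries.  Not SSSD itself.
import torch
import torch.nn as nn

T_steps, L, C = 100, 48, 3                            # diffusion steps, series length, channels
betas = torch.linspace(1e-4, 0.02, T_steps)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

denoiser = nn.Sequential(nn.Linear(3 * C, 64), nn.ReLU(), nn.Linear(64, C))

x0 = torch.randn(8, L, C)                             # ground-truth series (training data)
mask = (torch.rand(8, L, C) > 0.3).float()            # 1 = observed, 0 = missing

t = torch.randint(0, T_steps, (8, 1, 1))
noise = torch.randn_like(x0)
a = alpha_bar[t]                                      # (8, 1, 1), broadcasts over time/channels
x_t = a.sqrt() * x0 + (1 - a).sqrt() * noise          # noised series

cond = torch.cat([x_t, x0 * mask, mask], dim=-1)      # noised series, observed values, mask
loss = ((denoiser(cond) - noise) * (1 - mask)).pow(2).mean()   # loss only on missing entries
loss.backward()
```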
arXiv Detail & Related papers (2022-08-19T15:29:43Z) - Inertial Hallucinations -- When Wearable Inertial Devices Start Seeing Things [82.15959827765325]
We propose a novel approach to multimodal sensor fusion for Ambient Assisted Living (AAL).
We address two major shortcomings of standard multimodal approaches: limited area coverage and reduced reliability.
Our new framework fuses the concept of modality hallucination with triplet learning to train a model with different modalities to handle missing sensors at inference time.
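A minimal sketch of the modality-hallucination-plus-triplet-loss idea summarized above, with assumed encoder sizes and a simplified training scheme; it only illustrates how a triplet loss can pull a hallucinated embedding of a missing sensor toward the real one, not the paper's actual pipeline.

```python
# Illustration only: a hallucination network predicts the embedding of a missing
# sensor from the sensors that are present, and a triplet loss pulls that prediction
# toward the true embedding.  Network sizes and training scheme are assumptions.
import torch
import torch.nn as nn

enc_inertial = nn.Linear(32, 16)        # encoder for the wearable inertial stream
enc_camera = nn.Linear(128, 16)         # encoder for the (possibly missing) camera stream
hallucinate = nn.Linear(16, 16)         # predicts the camera embedding from the inertial one

imu, cam = torch.randn(8, 32), torch.randn(8, 128)
z_imu, z_cam = enc_inertial(imu), enc_camera(cam)

anchor = hallucinate(z_imu)             # hallucinated camera embedding
positive = z_cam                        # true camera embedding of the same sample
negative = z_cam.roll(1, dims=0)        # camera embedding of a different sample

loss = nn.TripletMarginLoss(margin=1.0)(anchor, positive, negative)
loss.backward()
# At inference time the camera can be absent: use hallucinate(enc_inertial(imu)) instead.
```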
arXiv Detail & Related papers (2022-07-14T10:04:18Z) - Deep Direct Discriminative Decoders for High-dimensional Time-series Data Analysis [0.0]
State-space models (SSMs) are widely utilized in the analysis of time-series data.
We propose a new formulation of SSM for high-dimensional observation processes.
We build a novel solution that efficiently estimates the underlying state processes from high-dimensional observation signals.
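For background, the sketch below simulates the textbook linear-Gaussian state space model and recovers its hidden state with a Kalman filter; this is the classical SSM setting that the discriminative decoders above generalize to high-dimensional observations, not the proposed method itself.

```python
# Background sketch of the classic linear-Gaussian SSM: the latent state evolves
# linearly and a noisy observation is read out from it; a Kalman filter estimates
# the hidden state from the observations.
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])        # state transition (position/velocity toy model)
H = np.array([[1.0, 0.0]])                    # observation matrix
Q, R = 0.01 * np.eye(2), np.array([[0.25]])   # process / observation noise covariances

rng = np.random.default_rng(0)
x = np.zeros(2)
xs, ys = [], []
for _ in range(50):                           # simulate the SSM
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
    ys.append(H @ x + rng.normal(0.0, np.sqrt(R[0, 0])))
    xs.append(x)

m, P = np.zeros(2), np.eye(2)                 # Kalman filter: posterior mean and covariance
for y in ys:
    m, P = A @ m, A @ P @ A.T + Q             # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    m = m + K @ (y - H @ m)
    P = (np.eye(2) - K @ H) @ P               # update

print(m, xs[-1])                              # filtered state estimate vs. true final state
```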
arXiv Detail & Related papers (2022-05-22T22:44:41Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Missing Value Imputation on Multidimensional Time Series [16.709162372224355]
We present DeepMVI, a deep learning method for missing value imputation in multidimensional time-series datasets.
DeepMVI combines fine-grained and coarse-grained patterns along a time series, and trends from related series across categorical dimensions.
Experiments show that DeepMVI is significantly more accurate, reducing error by more than 50% in more than half the cases.
arXiv Detail & Related papers (2021-03-02T09:55:05Z)