IMTS-Mixer: Mixer-Networks for Irregular Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2502.11816v1
- Date: Mon, 17 Feb 2025 14:06:36 GMT
- Title: IMTS-Mixer: Mixer-Networks for Irregular Multivariate Time Series Forecasting
- Authors: Christian Klötergens, Tim Dernedde, Lars Schmidt-Thieme
- Abstract summary: We introduce IMTS-Mixer, a novel forecasting architecture designed specifically for IMTS.
Our approach retains the core principles of TS mixer models while introducing innovative methods to transform IMTS into fixed-size matrix representations.
Our results demonstrate that IMTS-Mixer establishes a new state-of-the-art in forecasting accuracy while also improving computational efficiency.
- Abstract: Forecasting Irregular Multivariate Time Series (IMTS) has recently emerged as a distinct research field, necessitating specialized models to address its unique challenges. While most forecasting literature assumes regularly spaced observations without missing values, many real-world datasets - particularly in healthcare, climate research, and biomechanics - violate these assumptions. Time Series (TS)-mixer models have achieved remarkable success in regular multivariate time series forecasting. However, they remain unexplored for IMTS due to their requirement for complete and evenly spaced observations. To bridge this gap, we introduce IMTS-Mixer, a novel forecasting architecture designed specifically for IMTS. Our approach retains the core principles of TS mixer models while introducing innovative methods to transform IMTS into fixed-size matrix representations, enabling their seamless integration with mixer modules. We evaluate IMTS-Mixer on a benchmark of four real-world datasets from various domains. Our results demonstrate that IMTS-Mixer establishes a new state-of-the-art in forecasting accuracy while also improving computational efficiency.
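The abstract does not spell out the transformation itself, so the sketch below is a purely illustrative stand-in (not the authors' method): each channel's irregular observations are averaged into a fixed number of time bins, yielding a fixed-size matrix that a mixer module could consume. The bin count and fill value are arbitrary choices.
```python
# Minimal sketch, assuming a simple binning scheme; NOT the paper's encoder.
import numpy as np

def imts_to_matrix(obs, n_channels, t_start, t_end, n_bins=16, fill=0.0):
    """obs: iterable of (t, channel, value) with t_start <= t < t_end.
    Returns an (n_channels, n_bins) matrix a mixer module could consume."""
    sums = np.zeros((n_channels, n_bins))
    counts = np.zeros((n_channels, n_bins))
    width = (t_end - t_start) / n_bins
    for t, c, v in obs:
        b = min(int((t - t_start) / width), n_bins - 1)  # clamp the right edge
        sums[c, b] += v
        counts[c, b] += 1
    out = np.full((n_channels, n_bins), fill)
    observed = counts > 0
    out[observed] = sums[observed] / counts[observed]  # bin means; empty bins keep `fill`
    return out

# Two channels observed at irregular, non-aligned times:
obs = [(0.1, 0, 1.0), (0.7, 0, 2.0), (0.72, 1, -1.0), (3.9, 1, 0.5)]
print(imts_to_matrix(obs, n_channels=2, t_start=0.0, t_end=4.0, n_bins=8))
```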
Related papers
- General Time-series Model for Universal Knowledge Representation of Multivariate Time-Series data
We show that time series with different time granularities (or corresponding frequency resolutions) exhibit distinct joint distributions in the frequency domain.
A novel Fourier knowledge attention mechanism is proposed to enable learning time-aware representations from both the temporal and frequency domains.
An autoregressive blank infilling pre-training framework is incorporated into time series analysis for the first time, leading to a task-agnostic generative pre-training strategy.
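As a quick self-contained illustration of the granularity claim (our example, not the paper's), sampling the same signal hourly versus daily yields spectra over very different frequency grids, so their frequency-domain distributions necessarily differ:
```python
# Our toy demonstration; the signal and sampling choices are arbitrary.
import numpy as np

signal = lambda t: np.sin(2 * np.pi * t) + 0.5 * np.sin(2 * np.pi * t / 7)
for name, step in [("hourly", 1 / 24), ("daily", 1.0)]:
    t = np.arange(0.0, 14.0, step)            # 14 days of samples
    freqs = np.fft.rfftfreq(len(t), d=step)   # frequency grid, cycles/day
    spec = np.abs(np.fft.rfft(signal(t)))
    print(name, "grid up to", freqs[-1], "cycles/day;",
          "dominant:", round(float(freqs[spec.argmax()]), 3), "cycles/day")
# Hourly sampling resolves up to 12 cycles/day and finds the daily cycle;
# daily sampling tops out at 0.5 cycles/day and misses it entirely.
```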
arXiv Detail & Related papers (2025-02-05T15:20:04Z)
- xLSTM-Mixer: Multivariate Time Series Forecasting by Mixing via Scalar Memories
Time series data is prevalent across numerous fields, necessitating the development of robust and accurate forecasting models.
We introduce xLSTM-Mixer, a model designed to effectively integrate temporal sequences, joint time-variable information, and multiple perspectives for robust forecasting.
Our evaluations demonstrate xLSTM-Mixer's superior long-term forecasting performance compared to recent state-of-the-art methods.
arXiv Detail & Related papers (2024-10-22T11:59:36Z)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting
We propose UniTST, a transformer-based model with a unified attention mechanism over flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
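As a rough sketch of what attention over flattened patch tokens looks like (our reading of the summary, with arbitrary patch length and model width, not the authors' code): patches from every variable are concatenated into one token sequence so a single attention pass can capture inter- and intra-series dependencies jointly.
```python
import torch
import torch.nn as nn

B, C, L, P, D = 8, 7, 96, 16, 64              # batch, variables, length, patch, width
x = torch.randn(B, C, L)
patches = x.unfold(-1, P, P)                  # (B, C, L//P, P): non-overlapping patches
tokens = nn.Linear(P, D)(patches)             # embed each patch -> (B, C, L//P, D)
tokens = tokens.reshape(B, C * (L // P), D)   # flatten variables x patches into one sequence
attn = nn.MultiheadAttention(D, num_heads=4, batch_first=True)
out, _ = attn(tokens, tokens, tokens)         # one attention over all tokens at once
print(out.shape)                              # torch.Size([8, 42, 64])
```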
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
- UniTS: A Unified Multi-Task Time Series Model
UniTS is a unified multi-task time series model that integrates predictive and generative tasks into a single framework.
UniTS is tested on 38 datasets across human activity sensors, healthcare, engineering, and finance.
arXiv Detail & Related papers (2024-02-29T21:25:58Z)
- Unified Training of Universal Time Series Forecasting Transformers
We present Moirai, a Masked Encoder-based Universal Time Series Forecasting Transformer.
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series
We introduce Tiny Time Mixers (TTM), a compact model with effective transfer learning capabilities, trained exclusively on public TS datasets.
TTM incorporates innovations like adaptive patching, diverse resolution sampling, and resolution prefix tuning to handle pre-training on varied dataset resolutions.
It outperforms existing popular benchmarks in zero/few-shot forecasting by 4-40% while significantly reducing computational requirements.
arXiv Detail & Related papers (2024-01-08T15:21:21Z)
- Exploring Progress in Multivariate Time Series Forecasting: Comprehensive Benchmarking and Heterogeneity Analysis
We address the need for reliable and fair assessment of MTS forecasting proposals.
BasicTS+ is a benchmark designed to enable fair, comprehensive, and reproducible comparison of MTS forecasting solutions.
We apply BasicTS+ along with rich datasets to assess the capabilities of more than 45 MTS forecasting solutions.
arXiv Detail & Related papers (2023-10-09T19:52:22Z)
- Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting
We develop an adaptive, interpretable, and scalable forecasting framework that models each component of the spatial-temporal patterns individually.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z)
- TSMixer: An All-MLP Architecture for Time Series Forecasting
Time-Series Mixer (TSMixer) is a novel architecture built by stacking multi-layer perceptrons (MLPs).
On popular academic benchmarks, the simple-to-implement TSMixer is comparable to specialized state-of-the-art models.
We present various analyses to shed light on the capabilities of TSMixer.
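For context on the mixer family that both TSMixer and IMTS-Mixer build on, here is a condensed PyTorch sketch of the core pattern: an MLP applied across the time axis alternated with an MLP applied across the feature axis. Layer sizes and the residual/normalization layout are illustrative choices, not the paper's exact configuration.
```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One TSMixer-style block: mix along time, then along features."""
    def __init__(self, seq_len: int, n_channels: int, hidden: int = 64):
        super().__init__()
        # Time-mixing MLP: acts on the time axis, shared across channels.
        self.time_mlp = nn.Sequential(
            nn.Linear(seq_len, hidden), nn.ReLU(), nn.Linear(hidden, seq_len))
        # Feature-mixing MLP: acts on the channel axis, shared across steps.
        self.feat_mlp = nn.Sequential(
            nn.Linear(n_channels, hidden), nn.ReLU(), nn.Linear(hidden, n_channels))
        self.norm1 = nn.LayerNorm(n_channels)
        self.norm2 = nn.LayerNorm(n_channels)

    def forward(self, x):                      # x: (batch, seq_len, n_channels)
        # Transpose so the Linear layers see the time axis, then back.
        x = x + self.time_mlp(self.norm1(x).transpose(1, 2)).transpose(1, 2)
        x = x + self.feat_mlp(self.norm2(x))   # residual feature mixing
        return x

x = torch.randn(4, 96, 7)                      # 96 steps, 7 variables
print(MixerBlock(seq_len=96, n_channels=7)(x).shape)  # torch.Size([4, 96, 7])
```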
arXiv Detail & Related papers (2023-03-10T16:41:24Z)
- MTS-Mixers: Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing
This paper investigates the contributions and deficiencies of attention mechanisms in time series forecasting.
We propose MTS-Mixers, which use two factorized modules to capture temporal and channel dependencies.
Experimental results on several real-world datasets show that MTS-Mixers outperform existing Transformer-based models with higher efficiency.
arXiv Detail & Related papers (2023-02-09T08:52:49Z)
- LIFE: Learning Individual Features for Multivariate Time Series Prediction with Missing Values
We propose a Learning Individual Features (LIFE) framework, which provides a new paradigm for MTS prediction with missing values.
LIFE generates reliable features for prediction by using the correlated dimensions as auxiliary information and suppressing the interference from uncorrelated dimensions with missing values.
Experiments on three real-world datasets verify the superiority of LIFE over existing state-of-the-art models.
arXiv Detail & Related papers (2021-09-30T04:53:24Z)