Towards Long-Context Time Series Foundation Models
- URL: http://arxiv.org/abs/2409.13530v1
- Date: Fri, 20 Sep 2024 14:19:59 GMT
- Title: Towards Long-Context Time Series Foundation Models
- Authors: Nina Żukowska, Mononito Goswami, Michał Wiliński, Willa Potosnak, Artur Dubrawski
- Abstract summary: Time series foundation models have shown impressive performance on a variety of tasks, across a wide range of domains, even in zero-shot settings, but most are designed for short univariate inputs.
This study bridges that gap by systematically comparing various context expansion techniques from both the language and time series domains.
- Score: 17.224575072056627
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series foundation models have shown impressive performance on a variety of tasks, across a wide range of domains, even in zero-shot settings. However, most of these models are designed to handle short univariate time series as an input. This limits their practical use, especially in domains such as healthcare with copious amounts of long and multivariate data with strong temporal and intra-variate dependencies. Our study bridges this gap by cataloging and systematically comparing various context expansion techniques from both language and time series domains, and introducing a novel compressive memory mechanism to allow encoder-only TSFMs to effectively model intra-variate dependencies. We demonstrate the benefits of our approach by imbuing MOMENT, a recent family of multi-task time series foundation models, with multivariate context.
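The abstract names a compressive memory mechanism but does not spell out its mechanics here. As a rough illustration only, a linear-attention-style compressive memory over per-variate patch embeddings might look like the sketch below; all class, method, and parameter names are hypothetical and are not MOMENT's actual implementation.

```python
import torch
import torch.nn as nn

class CompressiveMemory(nn.Module):
    """Illustrative sketch: compress a stream of patch-embedding segments
    into a fixed-size memory (an outer-product accumulator), so context
    can grow without growing the attention cost per segment."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, segments: list[torch.Tensor]) -> torch.Tensor:
        d = self.q_proj.out_features
        mem = segments[0].new_zeros(d, d)   # fixed-size memory state
        norm = segments[0].new_zeros(d)     # normalizer for reads
        outputs = []
        for seg in segments:                # each seg: (num_patches, d_model)
            q = torch.sigmoid(self.q_proj(seg))  # positive feature maps
            k = torch.sigmoid(self.k_proj(seg))
            v = self.v_proj(seg)
            # Read: retrieve compressed context from earlier segments.
            read = (q @ mem) / (q @ norm).clamp(min=1e-6).unsqueeze(-1)
            outputs.append(read)
            # Write: fold the current segment into the memory state.
            mem = mem + k.T @ v
            norm = norm + k.sum(dim=0)
        return torch.cat(outputs, dim=0)
```

Under a scheme of this kind, each variate's patch sequence is consumed segment by segment, so memory stays constant while the effective context grows; whether the paper's mechanism takes this exact form is not stated in the abstract.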
Related papers
- Generalized Prompt Tuning: Adapting Frozen Univariate Time Series Foundation Models for Multivariate Healthcare Time Series [3.9599054392856483]
Time series foundation models are pre-trained on large datasets and are able to achieve state-of-the-art performance in diverse tasks.
We propose a prompt-tuning-inspired fine-tuning technique, Gen-P-Tuning, that enables adapting an existing univariate time series foundation model to multivariate inputs.
We demonstrate the effectiveness of our fine-tuning approach against various baselines on two MIMIC classification tasks, and on influenza-like illness forecasting.
arXiv Detail & Related papers (2024-11-19T19:20:58Z)
- Towards Generalisable Time Series Understanding Across Domains [10.350643783811174]
We introduce OTiS, an open model for general time series analysis.
We propose a novel pre-training paradigm including a tokeniser with learnable domain-specific signatures.
Our model is pre-trained on a large corpus of 640,187 samples and 11 billion time points spanning 8 distinct domains.
arXiv Detail & Related papers (2024-10-09T17:09:30Z)
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the Perspective of Partial Differential Equations [49.80959046861793]
We present PDETime, a novel long-term multivariate time series forecasting (LMTF) model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting [59.11817101030137]
This research advocates for a unified model paradigm that transcends domain boundaries.
Learning an effective cross-domain model, however, presents several challenges.
We propose UniTime for effective cross-domain time series learning.
arXiv Detail & Related papers (2023-10-15T06:30:22Z)
- Forecasting large collections of time series: feature-based methods [7.353918137830393]
When forecasting large collections of time series, two lines of approaches have been developed using time series features.
This chapter discusses the state-of-the-art feature-based methods, with reference to open-source software implementations.
arXiv Detail & Related papers (2023-09-25T01:23:02Z)
- SageFormer: Series-Aware Framework for Long-term Multivariate Time Series Forecasting [16.395374003276817]
This paper introduces a novel series-aware framework, explicitly designed to emphasize the significance of inter-series dependencies.
As a Series-aware Graph-enhanced Transformer model, SageFormer proficiently discerns and models the intricate relationships between series using graph structures.
Notably, the series-aware framework seamlessly integrates with existing Transformer-based models, enriching their ability to comprehend inter-series relationships.
arXiv Detail & Related papers (2023-07-04T10:08:25Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error without increasing time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Instance-wise Graph-based Framework for Multivariate Time Series Forecasting [69.38716332931986]
We propose a simple yet efficient instance-wise graph-based framework to utilize the inter-dependencies of different variables at different time stamps.
The key idea of our framework is aggregating information from the historical time series of different variables to the current time series that we need to forecast.
arXiv Detail & Related papers (2021-09-14T07:38:35Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts uni-directed relations among variables through a graph learning module; a hedged sketch of such a module follows this entry.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
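To make the idea of a graph learning module more concrete, here is a minimal sketch of how uni-directed relations among variables can be learned from node embeddings. The embedding size, the top-k sparsification rule, and all names are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn as nn

class GraphLearner(nn.Module):
    """Sketch: learn a sparse, uni-directed adjacency matrix over the
    variables of a multivariate series from two sets of node embeddings."""

    def __init__(self, num_nodes: int, dim: int = 16,
                 alpha: float = 3.0, k: int = 4):
        super().__init__()
        self.emb1 = nn.Parameter(torch.randn(num_nodes, dim))
        self.emb2 = nn.Parameter(torch.randn(num_nodes, dim))
        self.alpha, self.k = alpha, k  # k must be <= num_nodes

    def forward(self) -> torch.Tensor:
        m1 = torch.tanh(self.alpha * self.emb1)
        m2 = torch.tanh(self.alpha * self.emb2)
        # The pre-activation score is antisymmetric, so a[i, j] > 0
        # implies a[j, i] == 0: every learned edge has one direction.
        a = torch.relu(torch.tanh(self.alpha * (m1 @ m2.T - m2 @ m1.T)))
        # Keep only the k strongest outgoing edges per node.
        mask = torch.zeros_like(a)
        mask.scatter_(1, a.topk(self.k, dim=1).indices, 1.0)
        return a * mask
```

A downstream GNN forecaster would then propagate information along this learned adjacency; the 3-of-4-dataset result above is the paper's, not this sketch's.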