UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting
- URL: http://arxiv.org/abs/2310.09751v3
- Date: Fri, 23 Feb 2024 05:17:03 GMT
- Title: UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting
- Authors: Xu Liu, Junfeng Hu, Yuan Li, Shizhe Diao, Yuxuan Liang, Bryan Hooi,
Roger Zimmermann
- Abstract summary: This research advocates for a unified model paradigm that transcends domain boundaries.
Learning an effective cross-domain model, however, presents several challenges.
We propose UniTime for effective cross-domain time series learning.
- Score: 59.11817101030137
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate time series forecasting plays a pivotal role in contemporary web
technologies. In contrast to conventional methods that involve creating
dedicated models for specific time series application domains, this research
advocates for a unified model paradigm that transcends domain boundaries.
However, learning an effective cross-domain model presents the following
challenges. First, various domains exhibit disparities in data characteristics,
e.g., the number of variables, posing hurdles for existing models that impose
inflexible constraints on these factors. Second, the model may encounter
difficulties in distinguishing data from various domains, leading to suboptimal
performance in our assessments. Third, the diverse convergence rates of time
series domains can also result in compromised empirical performance. To address
these issues, we propose UniTime for effective cross-domain time series
learning. Concretely, UniTime can flexibly adapt to data with varying
characteristics. It also uses domain instructions and a Language-TS Transformer
to offer identification information and align two modalities. In addition,
UniTime employs masking to alleviate domain convergence speed imbalance issues.
Our extensive experiments demonstrate the effectiveness of UniTime in advancing
state-of-the-art forecasting performance and zero-shot transferability.
Related papers
- Towards Generalisable Time Series Understanding Across Domains [10.350643783811174]
We introduce OTiS, an open model for general time series analysis.
We propose a novel pre-training paradigm including a tokeniser with learnable domain-specific signatures.
Our model is pre-trained on a large corpus of 640,187 samples and 11 billion time points spanning 8 distinct domains.
arXiv Detail & Related papers (2024-10-09T17:09:30Z)
- Towards Long-Context Time Series Foundation Models [17.224575072056627]
Time series foundation models have shown impressive performance on a variety of tasks, across a wide range of domains, even in zero-shot settings.
This study bridges the gap by systematically comparing various context expansion techniques from both language and time series domains.
arXiv Detail & Related papers (2024-09-20T14:19:59Z)
- Cross-Domain Pre-training with Language Models for Transferable Time Series Representations [32.8353465232791]
CrossTimeNet is a novel cross-domain self-supervised learning (SSL) framework that learns transferable knowledge from various domains.
One of the key characteristics of CrossTimeNet is the newly designed time series tokenization module.
We conduct extensive experiments in a real-world scenario across various time series classification domains.
arXiv Detail & Related papers (2024-03-19T02:32:47Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the Perspective of Partial Differential Equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present the Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting [24.834846119163885]
We propose a novel framework, TEMPO, that can effectively learn time series representations.
TEMPO expands the capability for dynamically modeling real-world temporal phenomena from data within diverse domains.
arXiv Detail & Related papers (2023-10-08T00:02:25Z)
- SALUDA: Surface-based Automotive Lidar Unsupervised Domain Adaptation [62.889835139583965]
We introduce an unsupervised auxiliary task of learning an implicit underlying surface representation simultaneously on source and target data.
As both domains share the same latent representation, the model is forced to accommodate discrepancies between the two sources of data.
Our experiments demonstrate that our method achieves a better performance than the current state of the art, both in real-to-real and synthetic-to-real scenarios.
arXiv Detail & Related papers (2023-04-06T17:36:23Z)
- Optimal Event Monitoring through Internet Mashup over Multivariate Time Series [77.34726150561087]
The MTSA framework supports services for model definition, querying, parameter learning, model evaluation, data monitoring, decision recommendation, and web portals.
We further extend the MTSA data model and query language to support this class of problems for the services of learning, monitoring, and recommendation.
arXiv Detail & Related papers (2022-10-18T16:56:17Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Cross-domain Time Series Forecasting with Attention Sharing [10.180248006928107]
We propose a novel domain adaptation framework, Domain Adaptation Forecaster (DAF), to cope with the issue of data scarcity.
In particular, we propose an attention-based shared module with a domain discriminator across domains, as well as private modules for individual domains.
This allows us to jointly train the source and target domains by generating domain-invariant latent features while retaining domain-specific features. (A generic sketch of this adversarial setup follows the list.)
arXiv Detail & Related papers (2021-02-13T00:26:35Z)