PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting
- URL: http://arxiv.org/abs/2210.08964v5
- Date: Sun, 10 Dec 2023 23:53:41 GMT
- Title: PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting
- Authors: Hao Xue and Flora D. Salim
- Abstract summary: In existing time series forecasting methods, the models take a sequence of numerical values as input and yield numerical values as output.
Inspired by the successes of pre-trained language foundation models, we propose a new forecasting paradigm: prompt-based time series forecasting.
In this novel task, the numerical input and output are transformed into prompts and the forecasting task is framed in a sentence-to-sentence manner.
- Score: 11.670324826998968
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a new perspective on time series forecasting. In existing
time series forecasting methods, the models take a sequence of numerical values
as input and yield numerical values as output. The existing SOTA models are
largely based on the Transformer architecture, modified with multiple encoding
mechanisms to incorporate the context and semantics around the historical data.
Inspired by the successes of pre-trained language foundation models, we ask
whether these models can also be adapted to solve time series forecasting. Thus,
we propose a new forecasting paradigm: prompt-based time
series forecasting (PromptCast). In this novel task, the numerical input and
output are transformed into prompts and the forecasting task is framed in a
sentence-to-sentence manner, making it possible to directly apply language
models for forecasting purposes. To support and facilitate the research of this
task, we also present a large-scale dataset (PISA) that includes three
real-world forecasting scenarios. We evaluate different SOTA numerical-based
forecasting methods and language generation models. The benchmark results under
various forecasting settings demonstrate that the proposed PromptCast, combined
with language generation models, is a promising research direction. Additionally, in
comparison to conventional numerical-based forecasting, PromptCast shows a much
better generalization ability under the zero-shot setting.
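The abstract gives the core recipe (a numerical history window is rendered as an
input sentence and the target value as an output sentence) but does not spell out
the exact templates. A minimal Python sketch of the idea follows, assuming a
simple hypothetical template; the make_prompt and make_target helpers and the
"Region A" scenario are illustrative stand-ins, not the paper's actual prompts.

  # Hedged sketch of prompt-based forecasting: turn a numerical window into an
  # input sentence and the ground-truth next value into an output sentence.
  # The template wording here is assumed, not taken from the PromptCast paper.
  def make_prompt(place: str, values: list[int], horizon: int = 1) -> str:
      """Render a numerical history window as the input prompt sentence."""
      history = ", ".join(str(v) for v in values)
      return (
          f"From day 1 to day {len(values)}, the value in {place} was {history}. "
          f"What will the value be on day {len(values) + horizon}?"
      )

  def make_target(place: str, next_value: int, day: int) -> str:
      """Render the ground-truth future value as the output sentence."""
      return f"The value on day {day} in {place} will be {next_value}."

  # Example: a 5-step history with the 6th step as the forecasting target.
  window = [120, 135, 128, 140, 150]
  prompt = make_prompt("Region A", window)                 # model input sentence
  target = make_target("Region A", 162, len(window) + 1)   # expected output sentence

With pairs like (prompt, target), forecasting reduces to sentence-to-sentence
generation, so a language model can be trained or queried on the prompts directly;
parsing the predicted number back out of the generated sentence is the remaining
post-processing step.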
Related papers
- Context is Key: A Benchmark for Forecasting with Essential Textual Information [87.3175915185287]
"Context is Key" (CiK) is a time series forecasting benchmark that pairs numerical data with diverse types of carefully crafted textual context.
We evaluate a range of approaches, including statistical models, time series foundation models, and LLM-based forecasters.
Our experiments highlight the importance of incorporating contextual information, demonstrate surprising performance when using LLM-based forecasting models, and also reveal some of their critical shortcomings.
arXiv Detail & Related papers (2024-10-24T17:56:08Z)
- Metadata Matters for Time Series: Informative Forecasting with Transformers [70.38241681764738]
We propose a Metadata-informed Time Series Transformer (MetaTST) for time series forecasting.
To tackle the unstructured nature of metadata, MetaTST formalizes it into natural language using pre-designed templates.
A Transformer encoder is employed to communicate series and metadata tokens, which enriches the series representations with metadata information.
arXiv Detail & Related papers (2024-10-04T11:37:55Z)
- From News to Forecast: Integrating Event Analysis in LLM-Based Time Series Forecasting with Reflection [16.47323362700347]
We introduce a novel approach to enhance time series forecasting by reasoning across both text and time series data.
With language as a medium, our method adaptively integrates social events into forecasting models, aligning news content with time series fluctuations to provide richer insights.
Specifically, we utilize LLM-based agents to iteratively filter out irrelevant news and employ human-like reasoning to evaluate predictions.
arXiv Detail & Related papers (2024-09-26T03:50:22Z)
- Prompt Mining for Language-based Human Mobility Forecasting [10.325794804095889]
We propose a novel framework for prompt mining in language-based mobility forecasting.
The framework includes a prompt generation stage based on the information entropy of prompts and a prompt refinement stage that integrates mechanisms such as chain-of-thought prompting.
arXiv Detail & Related papers (2024-03-06T08:43:30Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- A decoder-only foundation model for time-series forecasting [23.824504640087753]
Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus.
It can work well across different forecasting history lengths, prediction lengths and temporal granularities.
arXiv Detail & Related papers (2023-10-14T17:01:37Z)
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting [54.04430089029033]
We present Lag-Llama, a general-purpose foundation model for time series forecasting based on a decoder-only transformer architecture.
Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization capabilities.
When fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance.
arXiv Detail & Related papers (2023-10-12T12:29:32Z)
- Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [110.20279343734548]
Time series forecasting holds significant importance in many real-world dynamic systems.
We present Time-LLM, a reprogramming framework to repurpose large language models for time series forecasting.
Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models.
arXiv Detail & Related papers (2023-10-03T01:31:25Z)
- Leveraging Language Foundation Models for Human Mobility Forecasting [8.422257363944295]
We propose a novel pipeline that leverages language foundation models for temporal sequential pattern mining.
We perform the forecasting task directly on the natural language input that includes all kinds of information.
Specific prompts are introduced to transform numerical temporal sequences into sentences so that existing language models can be directly applied.
arXiv Detail & Related papers (2022-09-11T01:15:16Z)
- Forecast with Forecasts: Diversity Matters [9.66075743192747]
In recent years, the idea of using time series features to construct forecast combination models has flourished in the forecasting area.
In this work, we suggest a change of focus from the historical data to the produced forecasts to extract features.
We calculate the diversity of a pool of models based on the corresponding forecasts as a decisive feature and use meta-learning to construct diversity-based forecast combination models.
arXiv Detail & Related papers (2020-12-03T02:14:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information (including all of its content) and is not responsible for any consequences of its use.