Performance of Zero-Shot Time Series Foundation Models on Cloud Data
- URL: http://arxiv.org/abs/2502.12944v1
- Date: Tue, 18 Feb 2025 15:28:02 GMT
- Title: Performance of Zero-Shot Time Series Foundation Models on Cloud Data
- Authors: William Toner, Thomas L. Lee, Artjom Joosen, Rajkarn Singh, Martin Asenov,
- Abstract summary: Time series foundation models (FMs) have emerged as a popular paradigm for zero-shot multi-domain forecasting.
We demonstrate that many well-known FMs fail to generate meaningful or accurate zero-shot forecasts in this setting.
We also illustrate a number of interesting pathologies, including instances where FMs suddenly output seemingly erratic, random-looking forecasts.
- Abstract: Time series foundation models (FMs) have emerged as a popular paradigm for zero-shot multi-domain forecasting. FMs are trained on numerous diverse datasets and claim to be effective forecasters across multiple different time series domains, including cloud data. In this work we investigate this claim, exploring the effectiveness of FMs on cloud data. We demonstrate that many well-known FMs fail to generate meaningful or accurate zero-shot forecasts in this setting. We support this claim empirically, showing that FMs are outperformed consistently by simple linear baselines. We also illustrate a number of interesting pathologies, including instances where FMs suddenly output seemingly erratic, random-looking forecasts. Our results suggest a widespread failure of FMs to model cloud data.
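The linear baselines the abstract refers to map a fixed history window to the forecast horizon with a single linear layer fitted by least squares. A minimal sketch of such a baseline on synthetic data (the series, window sizes, and variable names are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic series with daily-style seasonality plus noise (illustrative only).
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 288) + 0.1 * rng.standard_normal(t.size)

context, horizon = 96, 24
split = 1800  # hold out the tail for evaluation

# Build (context -> horizon) training pairs with a sliding window.
train = series[:split]
n = split - context - horizon
X = np.stack([train[i : i + context] for i in range(n)])
Y = np.stack([train[i + context : i + context + horizon] for i in range(n)])

# Fit one linear map from history window to forecast horizon via least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Forecast the held-out final window and score it.
forecast = series[-context - horizon : -horizon] @ W
target = series[-horizon:]
mae = np.abs(forecast - target).mean()
print(f"linear baseline MAE: {mae:.3f}")
```

Despite having no pretraining at all, this kind of per-dataset linear model is the benchmark the FMs are reported to lose against on cloud data.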
Related papers
- Lightweight Online Adaption for Time Series Foundation Model Forecasts
AdapTS is a lightweight mechanism for the online adaption of FM forecasts in response to online feedback.
We evaluate the performance of AdapTS in conjunction with several recent FMs across a suite of standard time series datasets.
arXiv Detail & Related papers (2025-02-18T15:01:02Z)
- Fine-Tuning Foundation Models with Federated Learning for Privacy Preserving Medical Time Series Forecasting
Federated Learning (FL) provides a decentralized machine learning approach, where multiple devices or servers collaboratively train a model without sharing their raw data.
In this paper, we fine-tune time series FMs with Electrocardiogram (ECG) and Impedance Cardiography (ICG) data using different FL techniques.
Our empirical results demonstrated that while FL can be effective for fine-tuning FMs on time series forecasting tasks, its benefits depend on the data distribution across clients.
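The decentralized training described above is usually realized with federated averaging: each client takes a few local gradient steps on its private data, and the server aggregates only the resulting weights. A toy sketch with linear models and synthetic clients (all details are illustrative assumptions, not the paper's ECG/ICG setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n_clients, n_features = 4, 8

# Each client holds private data from a slightly different distribution.
true_w = rng.standard_normal(n_features)
clients = []
for _ in range(n_clients):
    X = rng.standard_normal((100, n_features))
    y = X @ (true_w + 0.1 * rng.standard_normal(n_features))
    clients.append((X, y))

w_global = np.zeros(n_features)
for _ in range(20):  # communication rounds
    local_weights = []
    for X, y in clients:
        # Local update: a few gradient steps starting from the global model.
        w = w_global.copy()
        for _ in range(5):
            grad = X.T @ (X @ w - y) / len(y)
            w -= 0.05 * grad
        local_weights.append(w)
    # The server averages weights only -- raw data never leaves a client.
    w_global = np.mean(local_weights, axis=0)

print("distance to true weights:", np.linalg.norm(w_global - true_w))
```

The residual distance here reflects the client heterogeneity the summary mentions: the averaged model lands near the mean of the clients' individual optima, not at any single client's optimum.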
arXiv Detail & Related papers (2025-02-13T20:01:15Z)
- Time Series Foundational Models: Their Role in Anomaly Detection and Prediction
Time series foundational models (TSFM) have gained prominence in time series forecasting.
This paper critically evaluates the efficacy of TSFM in anomaly detection and prediction tasks.
arXiv Detail & Related papers (2024-12-26T17:15:30Z)
- Enabling Time-series Foundation Model for Building Energy Forecasting via Contrastive Curriculum Learning
We study the adaptation of foundation models (FMs) to building energy forecasting tasks.
We propose a new contrastive curriculum learning-based training method.
Experiments show that our method can improve the zero/few-shot performance by 14.6% compared to the existing FMs.
arXiv Detail & Related papers (2024-12-23T05:07:06Z)
- Tackling Data Heterogeneity in Federated Time Series Forecasting
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts
This paper introduces Moirai-MoE, using a single input/output projection layer while delegating the modeling of diverse time series patterns to the sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
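A sparse mixture of experts routes each input through only the top-k experts selected by a learned gate, so capacity grows without a proportional compute cost. An illustrative forward pass with random, untrained weights (shapes and names are assumptions for the sketch, not Moirai-MoE's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(2)
d_model, n_experts, top_k = 16, 8, 2

# Per-expert transforms and a gating matrix (random here, learned in practice).
experts = [0.1 * rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
gate = 0.1 * rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route x to its top-k experts and mix their outputs by gate weight."""
    logits = x @ gate                      # one score per expert
    top = np.argsort(logits)[-top_k:]      # indices of the k largest scores
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.standard_normal(d_model)
y = moe_forward(x)
print(y.shape)  # (16,)
```

Only 2 of the 8 expert matrices are ever multiplied for a given input, which is what makes the mixture "sparse".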
arXiv Detail & Related papers (2024-10-14T13:01:11Z)
- Foundation Models for Time Series Analysis: A Tutorial and Survey
Foundation Models (FMs) have fundamentally reshaped the paradigm of model design for time series analysis.
This survey aims to furnish a comprehensive and up-to-date overview of FMs for time series analysis.
arXiv Detail & Related papers (2024-03-21T10:08:37Z)
- Time Series Diffusion in the Frequency Domain
We analyze whether representing time series in the frequency domain is a useful inductive bias for score-based diffusion models.
We show that a dual diffusion process occurs in the frequency domain with an important nuance.
We show how to adapt the denoising score matching approach to implement diffusion models in the frequency domain.
arXiv Detail & Related papers (2024-02-08T18:59:05Z)
- ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
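One standard way to make a loss asymmetric so that it highlights extreme values is the pinball (quantile) loss, which penalizes under-prediction more than over-prediction when its quantile parameter exceeds 0.5. The sketch below uses this generic form as an illustration; it is not the paper's Exloss definition:

```python
import numpy as np

def asymmetric_loss(pred, target, tau=0.9):
    """Pinball (quantile) loss. With tau > 0.5, under-predicting the target
    costs more than over-predicting it, biasing forecasts toward high values."""
    err = target - pred
    return np.mean(np.maximum(tau * err, (tau - 1) * err))

pred = np.array([1.0, 2.0, 3.0])
target = np.array([1.5, 2.0, 2.0])
# Under-predicting by 0.5 costs tau * 0.5 = 0.45;
# over-predicting by 1.0 costs only (1 - tau) * 1.0 = 0.1.
loss = asymmetric_loss(pred, target)
print(loss)
```

Minimizing such a loss recovers the tau-quantile of the target distribution rather than its mean, which is why it helps with extremes.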
arXiv Detail & Related papers (2024-02-02T10:34:13Z)
- CAMul: Calibrated and Accurate Multi-view Time-Series Forecasting
We propose a general probabilistic multi-view forecasting framework CAMul.
It can learn representations and uncertainty from diverse data sources.
It integrates the knowledge and uncertainty from each data view in a dynamic context-specific manner.
We show that CAMul outperforms other state-of-the-art probabilistic forecasting models by over 25% in accuracy and calibration.
arXiv Detail & Related papers (2021-09-15T17:13:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.