SPEAR: Soft Prompt Enhanced Anomaly Recognition for Time Series Data
- URL: http://arxiv.org/abs/2510.03962v1
- Date: Sat, 04 Oct 2025 22:20:48 GMT
- Title: SPEAR: Soft Prompt Enhanced Anomaly Recognition for Time Series Data
- Authors: Hanzhe Wei, Jiajun Wu, Jialin Yang, Henry Leung, Steve Drew
- Abstract summary: Time series anomaly detection plays a crucial role in a wide range of fields, such as healthcare and internet traffic monitoring. The emergence of large language models (LLMs) offers new opportunities for detecting anomalies in the ubiquitous time series data. We propose Soft Prompt Enhanced Anomaly Recognition (SPEAR), a novel approach to leverage LLMs for anomaly detection with soft prompts and quantization.
- Score: 15.670885465827206
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series anomaly detection plays a crucial role in a wide range of fields, such as healthcare and internet traffic monitoring. The emergence of large language models (LLMs) offers new opportunities for detecting anomalies in the ubiquitous time series data. Traditional approaches struggle with variable-length time series sequences and context-based anomalies. We propose Soft Prompt Enhanced Anomaly Recognition (SPEAR), a novel approach that leverages LLMs for anomaly detection with soft prompts and quantization. Our methodology quantizes and transforms the time series data into input embeddings and combines them with learnable soft prompt embeddings. These combined embeddings are then fed into a frozen LLM. The soft prompts are updated iteratively based on a cross-entropy loss, allowing the model to adapt to time series anomaly detection. The soft prompts help adapt LLMs effectively to time series tasks, while quantization ensures the sequences are handled as the discrete tokens LLMs are designed to process. Our experimental results demonstrate that soft prompts effectively improve LLMs' performance on downstream time series anomaly detection tasks.
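The abstract describes the full pipeline: quantize the series into discrete tokens, turn the tokens into embeddings, prepend learnable soft prompt embeddings, pass everything through a frozen LLM, and update the soft prompts with a cross-entropy loss. The sketch below illustrates that loop in PyTorch with a GPT-2 backbone; the backbone choice, the bin count, the prompt length, the trainable token-embedding layer, and the binary classification head are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of a SPEAR-style training step (assumptions noted below).
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel

NUM_BINS = 64        # assumed number of quantization levels
PROMPT_LEN = 16      # assumed soft prompt length

def quantize(series: torch.Tensor, num_bins: int = NUM_BINS) -> torch.Tensor:
    """Map a continuous 1-D series to discrete bin indices (uniform quantization)."""
    lo, hi = series.min(), series.max()
    scaled = (series - lo) / (hi - lo + 1e-8)
    return (scaled * (num_bins - 1)).long()

llm = GPT2LMHeadModel.from_pretrained("gpt2")
for p in llm.parameters():                 # the LLM backbone stays frozen
    p.requires_grad_(False)

embed_dim = llm.config.n_embd
token_embed = nn.Embedding(NUM_BINS, embed_dim)          # bins -> input embeddings (assumed trainable)
soft_prompt = nn.Parameter(torch.randn(PROMPT_LEN, embed_dim) * 0.02)
head = nn.Linear(embed_dim, 2)                            # normal vs. anomalous (assumed head)

optimizer = torch.optim.Adam(
    [soft_prompt, *token_embed.parameters(), *head.parameters()], lr=1e-3
)
loss_fn = nn.CrossEntropyLoss()

def step(series: torch.Tensor, label: torch.Tensor) -> float:
    """One optimization step; only the soft prompt, embedding, and head are updated."""
    tokens = quantize(series)                              # (T,)
    x = token_embed(tokens)                                # (T, D)
    x = torch.cat([soft_prompt, x], dim=0).unsqueeze(0)    # prepend soft prompt -> (1, P+T, D)
    hidden = llm.transformer(inputs_embeds=x).last_hidden_state
    logits = head(hidden[:, -1])                           # classify from the last position
    loss = loss_fn(logits, label)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage on a synthetic window with an injected spike.
series = torch.randn(128)
series[100] += 8.0
print(step(series, torch.tensor([1])))
```

Because the backbone is frozen, only the soft prompt, the small embedding table, and the head receive gradients, which is what makes soft prompting an inexpensive way to adapt an LLM to the discrete sequences produced by quantization.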
Related papers
- Contextual and Seasonal LSTMs for Time Series Anomaly Detection [49.50689313712684]
We propose a novel prediction-based framework named Contextual and Seasonal LSTMs (CS-LSTMs). CS-LSTMs are built upon a noise decomposition strategy and jointly leverage contextual dependencies and seasonal patterns. They consistently outperform state-of-the-art methods, highlighting their effectiveness and practical value in robust time series anomaly detection.
arXiv Detail & Related papers (2026-02-10T11:46:15Z)
- LLM-Enhanced Reinforcement Learning for Time Series Anomaly Detection [1.1852406625172216]
Time series anomaly detection often suffers from sparse labels, complex temporal patterns, and costly expert annotation. We propose a unified framework that integrates Large Language Model (LLM)-based potential functions for reward shaping with Reinforcement Learning (RL), Variational Autoencoder (VAE)-enhanced dynamic reward scaling, and active learning with label propagation.
arXiv Detail & Related papers (2026-01-05T19:33:30Z)
- AXIS: Explainable Time Series Anomaly Detection with Large Language Models [33.68487894996624]
AXIS is a framework that conditions a frozen Large Language Model (LLM) for nuanced time-series understanding. LLMs operate on discrete tokens and struggle to directly process long, continuous signals. We introduce a new benchmark featuring multi-format questions and rationales that supervise contextual grounding and pattern-level semantics.
arXiv Detail & Related papers (2025-09-29T07:24:22Z)
- Semantic-Enhanced Time-Series Forecasting via Large Language Models [20.383296465541758]
Time series forecasting plays a significant role in finance, energy, meteorology, and IoT applications. Recent studies have leveraged the generalization capabilities of large language models (LLMs) to adapt to time series forecasting, achieving promising performance. We propose a novel Semantic-Enhanced LLM (SE-LLM) that explores the inherent periodicity and anomalous characteristics of time series and embeds them into the semantic space.
arXiv Detail & Related papers (2025-08-11T07:19:21Z)
- Time-RA: Towards Time Series Reasoning for Anomaly with LLM Feedback [55.284574165467525]
Time-series Reasoning for Anomaly (Time-RA) transforms classical time series anomaly detection into a generative, reasoning-intensive task. We also introduce the first real-world multimodal benchmark dataset, RATs40K, explicitly annotated for anomaly reasoning.
arXiv Detail & Related papers (2025-07-20T18:02:50Z)
- Forecasting Time Series with LLMs via Patch-Based Prompting and Decomposition [48.50019311384125]
We explore simple and flexible prompt-based strategies that enable LLMs to perform time series forecasting without extensive retraining. We propose our own method, PatchInstruct, which enables LLMs to make precise and effective predictions.
arXiv Detail & Related papers (2025-06-15T19:42:58Z)
- LLM-PS: Empowering Large Language Models for Time Series Forecasting with Temporal Patterns and Semantics [56.99021951927683]
Time Series Forecasting (TSF) is critical in many real-world domains like financial planning and health monitoring. Existing Large Language Models (LLMs) usually perform suboptimally because they neglect the inherent characteristics of time series data. We propose LLM-PS to empower the LLM for TSF by learning the fundamental Patterns and meaningful Semantics from time series data.
arXiv Detail & Related papers (2025-03-12T11:45:11Z)
- Can Multimodal LLMs Perform Time Series Anomaly Detection? [55.534264764673296]
We propose the VisualTimeAnomaly benchmark to evaluate MLLMs in time series anomaly detection (TSAD). Our approach transforms time series numerical data into image format and feeds these images into various MLLMs. In total, VisualTimeAnomaly contains 12.4k time series images spanning 3 scenarios and 3 anomaly granularities with 9 anomaly types across 8 MLLMs.
arXiv Detail & Related papers (2025-02-25T03:37:43Z)
- Hierarchical Multimodal LLMs with Semantic Space Alignment for Enhanced Time Series Classification [4.5939667818289385]
HiTime is a hierarchical multi-modal model that seamlessly integrates temporal information into large language models.
Our findings highlight the potential of integrating temporal features into LLMs, paving the way for advanced time series analysis.
arXiv Detail & Related papers (2024-10-24T12:32:19Z)
- Can LLMs Serve As Time Series Anomaly Detectors? [33.28502093260832]
An emerging topic in large language models (LLMs) is their application to time series forecasting.
In this paper, we investigate the capabilities of LLMs, specifically GPT-4 and LLaMA3, in detecting and explaining anomalies in time series.
arXiv Detail & Related papers (2024-08-06T23:14:39Z)
- Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [110.20279343734548]
Time series forecasting holds significant importance in many real-world dynamic systems.
We present Time-LLM, a reprogramming framework to repurpose large language models for time series forecasting.
Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models.
arXiv Detail & Related papers (2023-10-03T01:31:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.