LogGPT: Log Anomaly Detection via GPT
- URL: http://arxiv.org/abs/2309.14482v2
- Date: Mon, 11 Dec 2023 05:50:19 GMT
- Title: LogGPT: Log Anomaly Detection via GPT
- Authors: Xiao Han, Shuhan Yuan, Mohamed Trabelsi
- Abstract summary: We propose LogGPT, a novel framework that employs GPT for log anomaly detection.
LogGPT is first trained to predict the next log entry based on the preceding sequence.
We propose a novel reinforcement learning strategy to fine-tune the model specifically for the log anomaly detection task.
- Score: 15.790373280124196
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Detecting system anomalies based on log data is important for ensuring the
security and reliability of computer systems. Recently, deep learning models
have been widely used for log anomaly detection. The core idea is to model the
log sequences as natural language and adopt deep sequential models, such as
LSTM or Transformer, to encode the normal patterns in log sequences via
language modeling. However, there is a gap between language modeling and
anomaly detection as the objective of training a sequential model via a
language modeling loss is not directly related to anomaly detection. To fill
this gap, we propose LogGPT, a novel framework that employs GPT for log anomaly
detection. LogGPT is first trained to predict the next log entry based on the
preceding sequence. To further enhance the performance of LogGPT, we propose a
novel reinforcement learning strategy to fine-tune the model specifically for
the log anomaly detection task. The experimental results on three datasets show
that LogGPT significantly outperforms existing state-of-the-art approaches.
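To make the two ideas in the abstract concrete, below is a minimal sketch of (i) the next-log-key language-modeling objective and (ii) a Top-K decision rule commonly used with such models to flag anomalous sequences. The TinyLogGPT architecture, hyperparameters, and the is_anomalous rule are illustrative assumptions rather than the authors' released implementation, and the reinforcement-learning fine-tuning stage described in the abstract is omitted.

```python
# Hedged sketch: next-log-key prediction plus a Top-K anomaly rule.
# All names, sizes, and the toy data below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyLogGPT(nn.Module):
    """Small causal Transformer over parsed log-key IDs (positional encoding omitted for brevity)."""

    def __init__(self, vocab_size: int, d_model: int = 64, n_layers: int = 2, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask so position t only attends to positions <= t.
        seq_len = x.size(1)
        mask = torch.triu(torch.full((seq_len, seq_len), float("-inf"), device=x.device), diagonal=1)
        h = self.blocks(self.embed(x), mask=mask)
        return self.head(h)  # (batch, seq_len, vocab_size) logits


def lm_loss(model: nn.Module, seq: torch.Tensor) -> torch.Tensor:
    """Next-log-key objective: predict token t+1 from the prefix up to t."""
    logits = model(seq[:, :-1])
    return F.cross_entropy(logits.reshape(-1, logits.size(-1)), seq[:, 1:].reshape(-1))


@torch.no_grad()
def is_anomalous(model: nn.Module, seq: torch.Tensor, top_k: int = 5) -> bool:
    """Flag a sequence if any observed next key falls outside the model's Top-K predictions."""
    logits = model(seq[:, :-1])                       # (1, T-1, vocab)
    topk = logits.topk(top_k, dim=-1).indices         # (1, T-1, top_k)
    hits = (topk == seq[:, 1:].unsqueeze(-1)).any(dim=-1)
    return bool((~hits).any())


if __name__ == "__main__":
    vocab_size = 50                                   # number of distinct log keys (illustrative)
    model = TinyLogGPT(vocab_size)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    normal = torch.randint(0, vocab_size, (32, 20))   # stand-in for parsed normal log sequences
    for _ in range(5):                                # a few toy training steps
        opt.zero_grad()
        lm_loss(model, normal).backward()
        opt.step()
    print(is_anomalous(model, normal[:1], top_k=5))
```

In this framing, the language-modeling loss only rewards exact next-key prediction; the paper's reinforcement-learning stage instead optimizes the model directly for the downstream Top-K detection behavior, which is the gap the abstract refers to.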
Related papers
- LogLLM: Log-based Anomaly Detection Using Large Language Models [8.03646578793411]
We propose LogLLM, a log-based anomaly detection framework that leverages large language models (LLMs)
LogLLM employs BERT for extracting semantic vectors from log messages, while utilizing Llama, a transformer decoder-based model, for classifying log sequences.
Our framework is trained through a novel three-stage procedure designed to enhance performance and adaptability.
arXiv Detail & Related papers (2024-11-13T12:18:00Z) - LogELECTRA: Self-supervised Anomaly Detection for Unstructured Logs [0.0]
The goal of log-based anomaly detection is to automatically detect system anomalies by analyzing the large number of logs generated in a short period of time.
Previous studies have used a log parser to extract templates from unstructured log data and detected anomalies on the basis of patterns of template occurrences.
We propose LogELECTRA, a new log anomaly detection model that analyzes a single line of log messages more deeply on the basis of self-supervised anomaly detection.
arXiv Detail & Related papers (2024-02-16T01:47:02Z) - LogFormer: A Pre-train and Tuning Pipeline for Log Anomaly Detection [73.69399219776315]
We propose a unified Transformer-based framework for Log anomaly detection (LogFormer) to improve the generalization ability across different domains.
Specifically, our model is first pre-trained on the source domain to obtain shared semantic knowledge of log data.
Then, we transfer such knowledge to the target domain via shared parameters.
arXiv Detail & Related papers (2024-01-09T12:55:21Z) - GLAD: Content-aware Dynamic Graphs For Log Anomaly Detection [49.9884374409624]
We introduce GLAD, a Graph-based Log Anomaly Detection framework designed to detect anomalies in system logs.
arXiv Detail & Related papers (2023-09-12T04:21:30Z) - LogGPT: Exploring ChatGPT for Log-Based Anomaly Detection [35.48151798946824]
We propose LogGPT, a log-based anomaly detection framework based on ChatGPT.
By leveraging ChatGPT's language interpretation capabilities, LogGPT aims to explore the transferability of knowledge from large-scale corpora to log-based anomaly detection.
We conduct experiments to evaluate the performance of LogGPT and compare it with three deep learning-based methods on BGL and Spirit datasets.
arXiv Detail & Related papers (2023-09-03T14:22:57Z) - PULL: Reactive Log Anomaly Detection Based On Iterative PU Learning [58.85063149619348]
We propose PULL, an iterative log analysis method for reactive anomaly detection based on estimated failure time windows.
Our evaluation shows that PULL consistently outperforms ten benchmark baselines across three different datasets.
arXiv Detail & Related papers (2023-01-25T16:34:43Z) - LogGD:Detecting Anomalies from System Logs by Graph Neural Networks [14.813971618949068]
We propose a novel graph-based log anomaly detection method, LogGD, to effectively address the issue.
We exploit the powerful capability of Graph Transformer Neural Network, which combines graph structure and node semantics for log-based anomaly detection.
arXiv Detail & Related papers (2022-09-16T11:51:58Z) - LAnoBERT: System Log Anomaly Detection based on BERT Masked Language Model [12.00171674362062]
The aim of system log anomaly detection is to promptly identify anomalies while minimizing human intervention.
Previous studies performed anomaly detection through algorithms after converting various forms of log data into a standardized template.
In this study, we propose LAnoBERT, a log anomaly detection method based on the BERT masked language model that exhibits excellent natural language processing performance.
arXiv Detail & Related papers (2021-11-18T07:46:35Z) - Robust and Transferable Anomaly Detection in Log Data using Pre-Trained Language Models [59.04636530383049]
Anomalies or failures in large computer systems, such as the cloud, have an impact on a large number of users.
We propose a framework for anomaly detection in log data, as a major troubleshooting source of system information.
arXiv Detail & Related papers (2021-02-23T09:17:05Z) - Self-Attentive Classification-Based Anomaly Detection in Unstructured Logs [59.04636530383049]
We propose Logsy, a classification-based method to learn log representations.
We show an average improvement of 0.25 in the F1 score, compared to the previous methods.
arXiv Detail & Related papers (2020-08-21T07:26:55Z) - Self-Supervised Log Parsing [59.04636530383049]
Large-scale software systems generate massive volumes of semi-structured log records.
Existing approaches rely on log-specifics or manual rule extraction.
We propose NuLog that utilizes a self-supervised learning model and formulates the parsing task as masked language modeling.
arXiv Detail & Related papers (2020-03-17T19:25:25Z)