Lens: A Foundation Model for Network Traffic
- URL: http://arxiv.org/abs/2402.03646v4
- Date: Fri, 27 Sep 2024 15:30:40 GMT
- Title: Lens: A Foundation Model for Network Traffic
- Authors: Qineng Wang, Chen Qian, Xiaochang Li, Ziyu Yao, Gang Zhou, Huajie Shao
- Abstract summary: Lens is a foundation model for network traffic that leverages the T5 architecture to learn the pre-trained representations from large-scale unlabeled data.
We design a novel loss that combines three distinct tasks: Masked Span Prediction (MSP), Packet Order Prediction (POP), and Homologous Traffic Prediction (HTP).
- Score: 19.3652490585798
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network traffic refers to the amount of data being sent and received over the internet or any system that connects computers. Analyzing and understanding network traffic is vital for improving network security and management. However, the analysis of network traffic is challenging due to the diverse nature of data packets, which often feature heterogeneous headers and encrypted payloads lacking semantics. To capture the latent semantics of traffic, a few studies have adopted pre-training techniques based on the Transformer encoder or decoder to learn the representations from massive traffic data. However, these methods typically excel at either traffic understanding (classification) or traffic generation, but not both. To address this limitation, we develop Lens, a foundation model for network traffic that leverages the T5 architecture to learn the pre-trained representations from large-scale unlabeled data. Harnessing the strength of the encoder-decoder framework, which captures the global information while preserving the generative ability, our model can better learn the representations from raw data. To further enhance pre-training effectiveness, we design a novel loss that combines three distinct tasks: Masked Span Prediction (MSP), Packet Order Prediction (POP), and Homologous Traffic Prediction (HTP). Evaluation results across various benchmark datasets demonstrate that the proposed Lens outperforms the baselines in most downstream tasks related to both traffic understanding and generation. Notably, it also requires much less labeled data for fine-tuning compared to current methods.
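To make the combined pre-training objective concrete, below is a minimal PyTorch sketch of one way the three tasks named in the abstract (MSP, POP, HTP) could be summed into a single loss. The backbone, head shapes, pooling strategy, task weights, and all names (e.g., `CombinedPretrainLoss`, `pop_weight`, `htp_weight`) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (assumed, not the paper's code) of a combined pre-training loss
# mixing Masked Span Prediction (MSP), Packet Order Prediction (POP), and
# Homologous Traffic Prediction (HTP) on top of an encoder-decoder backbone.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CombinedPretrainLoss(nn.Module):
    def __init__(self, hidden_dim: int, vocab_size: int,
                 pop_weight: float = 1.0, htp_weight: float = 1.0):
        super().__init__()
        self.msp_head = nn.Linear(hidden_dim, vocab_size)  # recover masked span tokens
        self.pop_head = nn.Linear(hidden_dim, 2)            # are the packets in order?
        self.htp_head = nn.Linear(hidden_dim, 2)            # do two flows share an origin?
        self.pop_weight = pop_weight
        self.htp_weight = htp_weight

    def forward(self, decoder_states, pooled_states,
                msp_labels, pop_labels, htp_labels):
        # MSP: token-level cross-entropy over decoder outputs (padding masked with -100).
        msp_loss = F.cross_entropy(
            self.msp_head(decoder_states).flatten(0, 1),
            msp_labels.flatten(), ignore_index=-100)
        # POP: binary classification from a pooled sequence representation.
        pop_loss = F.cross_entropy(self.pop_head(pooled_states), pop_labels)
        # HTP: binary classification from the same pooled representation.
        htp_loss = F.cross_entropy(self.htp_head(pooled_states), htp_labels)
        return msp_loss + self.pop_weight * pop_loss + self.htp_weight * htp_loss

# Toy usage with random tensors (batch of 4, sequence length 16, hidden size 64).
loss_fn = CombinedPretrainLoss(hidden_dim=64, vocab_size=512)
loss = loss_fn(
    decoder_states=torch.randn(4, 16, 64),
    pooled_states=torch.randn(4, 64),
    msp_labels=torch.randint(0, 512, (4, 16)),
    pop_labels=torch.randint(0, 2, (4,)),
    htp_labels=torch.randint(0, 2, (4,)),
)
```

The sketch assumes binary formulations for POP and HTP and a single pooled vector per sample; the paper's exact task setups, inputs, and weighting may differ.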
Related papers
- Deep Learning-driven Mobile Traffic Measurement Collection and Analysis [0.43512163406552007]
In this thesis, we harness the powerful hierarchical feature learning abilities of Deep Learning (DL) techniques in both spatial and temporal domains.
We develop solutions for precise city-scale mobile traffic analysis and forecasting.
arXiv Detail & Related papers (2024-10-14T06:53:45Z)
- BjTT: A Large-scale Multimodal Dataset for Traffic Prediction [49.93028461584377]
Traditional traffic prediction methods rely on historical traffic data to predict traffic trends.
In this work, we explore how generative models combined with text describing the traffic system can be applied to traffic generation.
We propose ChatTraffic, the first diffusion model for text-to-traffic generation.
arXiv Detail & Related papers (2024-03-08T04:19:56Z)
- Many or Few Samples? Comparing Transfer, Contrastive and Meta-Learning in Encrypted Traffic Classification [68.19713459228369]
We compare transfer learning, meta-learning, and contrastive learning against reference tree-based Machine Learning (ML) and monolithic DL models.
We show that (i) using large datasets, we can obtain more general representations, and (ii) contrastive learning is the best methodology.
While tree-based ML models cannot handle large tasks but fit small tasks well, DL methods, by reusing learned representations, reach tree-based performance on small tasks as well.
arXiv Detail & Related papers (2023-05-21T11:20:49Z)
- NetGPT: Generative Pretrained Transformer for Network Traffic [4.205009931131087]
Pretrained models for network traffic can utilize large-scale raw data to learn the essential characteristics of network traffic.
In this paper, we make the first attempt to provide a generative pretrained model, NetGPT, for both traffic understanding and generation tasks.
arXiv Detail & Related papers (2023-04-19T09:04:30Z)
- TraffNet: Learning Causality of Traffic Generation for What-if Prediction [4.604622556490027]
Real-time what-if traffic prediction is crucial for decision making in intelligent traffic management and control.
Here, we present a simple deep learning framework called TraffNet that learns the mechanisms of traffic generation for what-if prediction.
arXiv Detail & Related papers (2023-03-28T13:12:17Z)
- Efficient Federated Learning with Spike Neural Networks for Traffic Sign Recognition [70.306089187104]
We introduce powerful Spike Neural Networks (SNNs) into traffic sign recognition for energy-efficient and fast model training.
Numerical results indicate that the proposed federated SNN outperforms traditional federated convolutional neural networks in terms of accuracy, noise immunity, and energy efficiency.
arXiv Detail & Related papers (2022-05-28T03:11:48Z)
- Auto-Transfer: Learning to Route Transferrable Representations [77.30427535329571]
We propose a novel adversarial multi-armed bandit approach which automatically learns to route source representations to appropriate target representations.
We see upwards of 5% accuracy improvements compared with the state-of-the-art knowledge transfer methods.
arXiv Detail & Related papers (2022-02-02T13:09:27Z)
- Road Network Guided Fine-Grained Urban Traffic Flow Inference [108.64631590347352]
Accurate inference of fine-grained traffic flow from coarse-grained data is an emerging yet crucial problem.
We propose a novel Road-Aware Traffic Flow Magnifier (RATFM) that exploits the prior knowledge of road networks.
Our method can generate high-quality fine-grained traffic flow maps.
arXiv Detail & Related papers (2021-09-29T07:51:49Z)
- TrafficStream: A Streaming Traffic Flow Forecasting Framework Based on Graph Neural Networks and Continual Learning [10.205873494981633]
We propose a Streaming Traffic Flow Forecasting Framework, TrafficStream, based on Graph Neural Networks (GNNs) and Continual Learning (CL).
A JS-divergence-based algorithm is proposed to mine new traffic patterns.
We construct a streaming traffic dataset to verify the efficiency and effectiveness of our model.
arXiv Detail & Related papers (2021-06-11T09:42:37Z)
- Learning dynamic and hierarchical traffic spatiotemporal features with Transformer [4.506591024152763]
This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting.
Transformer is the most popular framework in Natural Language Processing (NLP).
Analyzing the attention weight matrices reveals the influential parts of road networks, allowing us to model the traffic networks better.
arXiv Detail & Related papers (2021-04-12T02:29:58Z) - Pre-Trained Models for Heterogeneous Information Networks [57.78194356302626]
We propose a self-supervised pre-training and fine-tuning framework, PF-HIN, to capture the features of a heterogeneous information network.
PF-HIN consistently and significantly outperforms state-of-the-art alternatives on each of these tasks across four datasets.
arXiv Detail & Related papers (2020-07-07T03:36:28Z)