RouteNet-Erlang: A Graph Neural Network for Network Performance Evaluation
- URL: http://arxiv.org/abs/2202.13956v1
- Date: Mon, 28 Feb 2022 17:09:53 GMT
- Title: RouteNet-Erlang: A Graph Neural Network for Network Performance Evaluation
- Authors: Miquel Ferriol-Galmés, Krzysztof Rusek, José Suárez-Varela,
Shihan Xiao, Xiangle Cheng, Pere Barlet-Ros, Albert Cabellos-Aparicio
- Abstract summary: We present RouteNet-Erlang, a pioneering GNN architecture designed to model computer networks.
RouteNet-Erlang supports complex traffic models, multi-queue scheduling policies, and routing policies, and can provide accurate estimates in networks not seen during training.
We benchmark RouteNet-Erlang against a state-of-the-art QT model, and our results show that it outperforms QT in all the network scenarios.
- Score: 5.56275556529722
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network modeling is a fundamental tool in network research, design, and
operation. Arguably the most popular method for modeling is Queuing Theory
(QT). Its main limitation is that it imposes strong assumptions on the packet
arrival process, which typically do not hold in real networks. In the field of
Deep Learning, Graph Neural Networks (GNN) have emerged as a new technique to
build data-driven models that can learn complex and non-linear behavior. In
this paper, we present \emph{RouteNet-Erlang}, a pioneering GNN architecture
designed to model computer networks. RouteNet-Erlang supports complex traffic
models, multi-queue scheduling policies, routing policies and can provide
accurate estimates in networks not seen in the training phase. We benchmark
RouteNet-Erlang against a state-of-the-art QT model, and our results show that
it outperforms QT in all the network scenarios.
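As a rough illustration of the core idea, the toy sketch below alternates message passing between per-path and per-link hidden states — the structural pattern RouteNet-style models build on. The topology, dimensions, and update rules are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

# Hypothetical toy topology: 3 links, 2 paths given as ordered link indices.
PATHS = [[0, 1], [1, 2]]
N_LINKS, DIM = 3, 4

rng = np.random.default_rng(0)
W_link = rng.normal(scale=0.1, size=(DIM, DIM))  # link-update weights (toy)
W_path = rng.normal(scale=0.1, size=(DIM, DIM))  # path-update weights (toy)

link_h = rng.normal(size=(N_LINKS, DIM))         # hidden state per link
path_h = rng.normal(size=(len(PATHS), DIM))      # hidden state per path

for _ in range(8):  # T message-passing iterations
    # Path update: each path reads the states of the links it traverses, in order.
    for p, links in enumerate(PATHS):
        for l in links:
            path_h[p] = np.tanh(path_h[p] @ W_path + link_h[l])
    # Link update: each link aggregates messages from every path crossing it.
    for l in range(N_LINKS):
        msg = sum(path_h[p] for p, links in enumerate(PATHS) if l in links)
        link_h[l] = np.tanh(link_h[l] @ W_link + msg)

# Readout: a per-path delay estimate from the final path state (toy linear head).
delay = path_h @ rng.normal(size=DIM)
print(delay.shape)  # one estimate per path
```

In the trained model, the readout and update functions are learned end to end; this sketch only shows how path and link states exchange information.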
Related papers
- TCCT-Net: Two-Stream Network Architecture for Fast and Efficient Engagement Estimation via Behavioral Feature Signals [58.865901821451295]
We present a novel two-stream feature fusion "Tensor-Convolution and Convolution-Transformer Network" (TCCT-Net) architecture.
To better learn the meaningful patterns in the temporal-spatial domain, we design a "CT" stream that integrates a hybrid convolutional-transformer.
In parallel, to efficiently extract rich patterns from the temporal-frequency domain, we introduce a "TC" stream that uses Continuous Wavelet Transform (CWT) to represent information in a 2D tensor form.
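As a rough sketch of the "TC" stream's input representation, the following computes a scale-by-time tensor with a hand-rolled Morlet CWT. The wavelet choice, scales, and normalization here are assumptions, not TCCT-Net's actual front end.

```python
import numpy as np

def morlet(t, w=5.0):
    # Real part of a Morlet wavelet (illustrative; normalization omitted).
    return np.cos(w * t) * np.exp(-t**2 / 2)

def cwt_2d(signal, scales, width=6.0):
    # Correlate the signal with the wavelet at each scale; rows index scale,
    # columns index time, giving the 2D tensor fed to the "TC" stream.
    out = np.empty((len(scales), len(signal)))
    for i, s in enumerate(scales):
        t = np.arange(-width * s, width * s + 1) / s
        out[i] = np.convolve(signal, morlet(t), mode="same")
    return out

sig = np.sin(2 * np.pi * 0.05 * np.arange(256))
tensor = cwt_2d(sig, scales=np.arange(1, 17))
print(tensor.shape)  # (16, 256): scale x time
```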
arXiv Detail & Related papers (2024-04-15T06:01:48Z)
- Auto-Train-Once: Controller Network Guided Automatic Network Pruning from Scratch [72.26822499434446]
Auto-Train-Once (ATO) is an innovative network pruning algorithm designed to automatically reduce the computational and storage costs of DNNs.
We provide a comprehensive convergence analysis as well as extensive experiments, and the results show that our approach achieves state-of-the-art performance across various model architectures.
arXiv Detail & Related papers (2024-03-21T02:33:37Z)
- Learning State-Augmented Policies for Information Routing in Communication Networks [92.59624401684083]
We develop a novel State Augmentation (SA) strategy to maximize the aggregate information at source nodes using graph neural network (GNN) architectures.
We leverage an unsupervised learning procedure to convert the output of the GNN architecture to optimal information routing strategies.
In the experiments, we perform the evaluation on real-time network topologies to validate our algorithms.
arXiv Detail & Related papers (2023-09-30T04:34:25Z)
- RouteNet-Fermi: Network Modeling with Graph Neural Networks [7.227467283378366]
We present RouteNet-Fermi, a custom Graph Neural Network (GNN) model that shares the same goals as Queuing Theory.
The proposed model accurately predicts the delay, jitter, and packet loss of a network.
Our experimental results show that RouteNet-Fermi achieves similar accuracy as computationally-expensive packet-level simulators.
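For context, the queuing-theory baselines such models are measured against boil down to closed-form results like the textbook M/M/1 (delay, jitter) and M/M/1/K (loss) formulas below. This is standard queuing theory, not the specific baseline model used in the paper.

```python
# Classic M/M/1 and M/M/1/K formulas. lam = arrival rate, mu = service rate;
# both assume Poisson arrivals, the strong assumption the abstract criticizes.
def mm1_delay(lam, mu):
    assert lam < mu, "queue must be stable"
    return 1.0 / (mu - lam)          # mean sojourn time W

def mm1_jitter(lam, mu):
    return 1.0 / (mu - lam)          # sojourn time is exponential: std == mean

def mm1k_loss(lam, mu, K):
    rho = lam / mu                   # blocking probability with K queue slots
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1 - rho) * rho**K / (1 - rho**(K + 1))

print(mm1_delay(8.0, 10.0))   # 0.5 time units
print(mm1k_loss(8.0, 10.0, 10))
```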
arXiv Detail & Related papers (2022-12-22T23:02:40Z)
- Interference Cancellation GAN Framework for Dynamic Channels [74.22393885274728]
We introduce an online training framework that can adapt to any changes in the channel.
Our framework significantly outperforms recent neural network models on highly dynamic channels.
arXiv Detail & Related papers (2022-08-17T02:01:18Z)
- Open World Learning Graph Convolution for Latency Estimation in Routing Networks [16.228327606985257]
We propose a novel approach for modeling network routing, using Graph Neural Networks.
Our model shares a stable performance across different network sizes and configurations of routing networks, while at the same time being able to extrapolate towards unseen sizes, configurations, and user behavior.
We show that our model outperforms most conventional deep-learning-based models, in terms of prediction accuracy, computational resources, inference speed, as well as ability to generalize towards open-world input.
arXiv Detail & Related papers (2022-07-08T19:26:40Z) - Simulating Network Paths with Recurrent Buffering Units [4.7590500506853415]
We seek a model that generates end-to-end packet delay values in response to the time-varying load offered by a sender.
We propose a novel grey-box approach to network simulation that embeds the semantics of physical network path in a new RNN-style architecture called Recurrent Buffering Unit.
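A minimal caricature of the grey-box idea: embed the semantics of a buffer (a fluid-queue recurrence) inside a recurrent update that maps time-varying offered load to per-step delay. The actual Recurrent Buffering Unit is a learned architecture; this sketch only shows the embedded queue physics, with toy parameters.

```python
# Toy "buffering unit": a recurrent update tracking queue backlog on one hop.
def simulate_path(loads, capacity=1.0, prop_delay=0.01):
    backlog, delays = 0.0, []
    for load in loads:                                 # time-varying offered load
        backlog = max(0.0, backlog + load - capacity)  # fluid-queue recurrence
        delays.append(prop_delay + backlog / capacity) # queueing + propagation
    return delays

# Delay builds up while the offered load exceeds capacity, then drains.
d = simulate_path([0.5, 1.5, 1.5, 0.2])
print(d)
```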
arXiv Detail & Related papers (2022-02-23T16:46:31Z) - Packet Routing with Graph Attention Multi-agent Reinforcement Learning [4.78921052969006]
We develop a model-free and data-driven routing strategy by leveraging reinforcement learning (RL).
Considering the graph nature of the network topology, we design a multi-agent RL framework in combination with Graph Neural Networks (GNN).
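As a toy stand-in for the idea, the sketch below trains tabular Q-learning agents, one per node, to pick a next hop toward a fixed destination. The real framework replaces these tables with GNN-based agents; the topology, rewards, and hyperparameters here are assumptions.

```python
import random

# Toy topology: node 0 can reach destination 3 via 1 (2 hops) or via 2 -> 4 (3 hops).
GRAPH = {0: [1, 2], 1: [3], 2: [4], 4: [3], 3: []}
DEST = 3
Q = {n: {a: 0.0 for a in GRAPH[n]} for n in GRAPH if GRAPH[n]}
random.seed(0)

for _ in range(500):  # episodes routing one packet from node 0
    node = 0
    while node != DEST:
        acts = Q[node]
        # epsilon-greedy next-hop choice
        a = random.choice(list(acts)) if random.random() < 0.2 else max(acts, key=acts.get)
        reward = 0.0 if a == DEST else -1.0            # -1 per extra hop
        nxt = 0.0 if a == DEST else max(Q[a].values()) # bootstrap from next node
        Q[node][a] += 0.5 * (reward + nxt - Q[node][a])
        node = a

best = max(Q[0], key=Q[0].get)  # learned next hop at node 0
print(best, Q[0])
```

After training, node 0 prefers the shorter route via node 1.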
arXiv Detail & Related papers (2021-07-28T06:20:34Z) - Applying Graph-based Deep Learning To Realistic Network Scenarios [5.453745629140304]
This paper presents a new Graph-based deep learning model able to accurately estimate the per-path mean delay in networks.
The proposed model can generalize successfully over topologies, routing configurations, queue scheduling policies and traffic matrices unseen during the training phase.
arXiv Detail & Related papers (2020-10-13T20:58:59Z) - Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of using the same path of the network, DG-Net aggregates features dynamically in each node, which allows the network to have more representation ability.
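The dynamic aggregation can be caricatured as input-conditioned gating: each node weights its incoming features with gates computed from its own input, so different instances take different effective paths. The weights below are random stand-ins, not trained DG-Net routers.

```python
import numpy as np

rng = np.random.default_rng(1)
W_gate = rng.normal(scale=0.1, size=(4, 3))  # maps node input to one gate per edge

def aggregate(incoming, x):
    # incoming: (3, 4) features from 3 predecessor blocks; x: (4,) node input.
    gates = 1 / (1 + np.exp(-(x @ W_gate)))  # sigmoid gate per incoming edge
    return gates @ incoming                  # instance-dependent weighted sum

feats = rng.normal(size=(3, 4))
out_a = aggregate(feats, rng.normal(size=4))
out_b = aggregate(feats, rng.normal(size=4))
# Same incoming features, different inputs -> different aggregations.
print(np.linalg.norm(out_a - out_b))
```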
arXiv Detail & Related papers (2020-10-02T16:50:26Z) - Deep Learning for Ultra-Reliable and Low-Latency Communications in 6G
Networks [84.2155885234293]
We first summarize how to apply data-driven supervised deep learning and deep reinforcement learning in URLLC.
To address these open problems, we develop a multi-level architecture that enables device intelligence, edge intelligence, and cloud intelligence for URLLC.
arXiv Detail & Related papers (2020-02-22T14:38:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed here and is not responsible for any consequences arising from its use.