Reinforcement Learning of Graph Neural Networks for Service Function Chaining
- URL: http://arxiv.org/abs/2011.08406v1
- Date: Tue, 17 Nov 2020 03:50:53 GMT
- Title: Reinforcement Learning of Graph Neural Networks for Service Function Chaining
- Authors: DongNyeong Heo, Doyoung Lee, Hee-Gon Kim, Suhyun Park, Heeyoul Choi
- Abstract summary: Service function chaining (SFC) modules play an important role by generating efficient paths for network traffic through physical servers.
A previous supervised learning method demonstrated that network features can be represented by graph neural networks (GNNs) for the SFC task.
In this paper, we apply reinforcement learning methods to train models on various network topologies with unlabeled data.
- Score: 3.9373541926236766
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the management of computer network systems, service function
chaining (SFC) modules play an important role by generating efficient paths for
network traffic through physical servers with virtualized network functions
(VNFs). To provide the highest quality of service, the SFC module should
generate a valid path quickly even under varying network conditions, including
dynamic VNF resources, diverse requests, and topology changes. A previous
supervised learning method demonstrated that network features can be
represented by graph neural networks (GNNs) for the SFC task. However, its
performance was limited to a fixed topology with labeled data. In this paper,
we apply reinforcement learning methods to train models on various network
topologies with unlabeled data. In the experiments, compared to the previous
supervised learning method, the proposed methods demonstrated remarkable
flexibility on new topologies without redesign or retraining, while preserving
a similar level of performance.
Related papers
- Local Kernel Renormalization as a mechanism for feature learning in overparametrized Convolutional Neural Networks [0.0]
Empirical evidence shows that fully-connected neural networks in the infinite-width limit eventually outperform their finite-width counterparts.
State-of-the-art architectures with convolutional layers achieve optimal performance in the finite-width regime.
We show that the generalization performance of a finite-width FC network can be obtained by an infinite-width network, with a suitable choice of the Gaussian priors.
arXiv Detail & Related papers (2023-07-21T17:22:04Z)
- Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep feature selection (FS) method that simultaneously conducts feature selection and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-21T08:15:55Z)
- Advanced Scaling Methods for VNF deployment with Reinforcement Learning [0.0]
Network function virtualization (NFV) and software-defined networking (SDN) are emerging network paradigms.
Reinforcement learning (RL)-based approaches have been proposed to optimize VNF deployment.
In this paper, we propose an enhanced model which can be adapted to more general network settings.
arXiv Detail & Related papers (2023-01-19T21:31:23Z)
- SIRe-Networks: Skip Connections over Interlaced Multi-Task Learning and Residual Connections for Structure Preserving Object Classification [28.02302915971059]
In this paper, we introduce an interlaced multi-task learning strategy, named SIRe, to reduce the vanishing gradient in relation to the object classification task.
The presented methodology directly improves a convolutional neural network (CNN) by enforcing the input image structure preservation through auto-encoders.
To validate the presented methodology, a simple CNN and various implementations of famous networks are extended via the SIRe strategy and extensively tested on the CIFAR100 dataset.
arXiv Detail & Related papers (2021-10-06T13:54:49Z)
- Sequential Deep Learning Architectures for Anomaly Detection in Virtual Network Function Chains [0.0]
This paper studies an anomaly detection system (ADS) for virtual network functions in service function chains (SFCs).
We propose several sequential deep learning models to learn time-series patterns and sequential patterns of the virtual network functions (VNFs) in the chain with variable lengths.
arXiv Detail & Related papers (2021-09-29T08:47:57Z)
- Joint Learning of Neural Transfer and Architecture Adaptation for Image Recognition [77.95361323613147]
Current state-of-the-art visual recognition systems rely on pretraining a neural network on a large-scale dataset and finetuning the network weights on a smaller dataset.
In this work, we prove that dynamically adapting network architectures tailored to each domain task, along with weight finetuning, benefits both efficiency and effectiveness.
Our method can be easily generalized to an unsupervised paradigm by replacing supernet training with self-supervised learning in the source domain tasks and performing linear evaluation in the downstream tasks.
arXiv Detail & Related papers (2021-03-31T08:15:17Z)
- Local Critic Training for Model-Parallel Learning of Deep Neural Networks [94.69202357137452]
We propose a novel model-parallel learning method, called local critic training.
We show that the proposed approach successfully decouples the update process of the layer groups for both convolutional neural networks (CNNs) and recurrent neural networks (RNNs).
We also show that networks trained by the proposed method can be used for structural optimization.
arXiv Detail & Related papers (2021-02-03T09:30:45Z)
- Graph Neural Network based Service Function Chaining for Automatic Network Control [0.4817429789586127]
Service function chaining (SFC) is an important technology for finding efficient paths through network servers.
We propose a new neural network architecture for SFC based on graph neural networks (GNNs); a minimal supervised-training sketch contrasting it with the RL approach above appears after this list.
arXiv Detail & Related papers (2020-09-11T06:01:27Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Neural networks adapting to datasets: learning network size and topology [77.34726150561087]
We introduce a flexible setup that allows a neural network to learn both its size and topology during gradient-based training.
The resulting network has the structure of a graph tailored to the particular learning task and dataset.
arXiv Detail & Related papers (2020-06-22T12:46:44Z)
- Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z)
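For contrast with the RL objective sketched earlier, the related entry "Graph Neural Network based Service Function Chaining for Automatic Network Control" trains its GNN with supervision on a fixed topology. Below is a minimal, hypothetical sketch of such a supervised step, reusing the illustrative GNNPolicy from the earlier example; the function name and signature are assumptions, not the cited paper's API.

```python
# Hypothetical supervised variant (not the cited paper's exact method):
# cross-entropy against a labeled next hop on a fixed topology, in place of
# the REINFORCE objective. `policy` is the illustrative GNNPolicy above.
import torch

def supervised_step(policy, opt, x, adj, cur, candidates, label_idx):
    """One gradient step toward the labeled next hop at index label_idx."""
    log_probs = policy(x, adj, cur, candidates)  # log-probabilities over candidates
    loss = -log_probs[label_idx]                 # negative log-likelihood of the label
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

This labeled objective only covers topologies for which expert paths exist, which is exactly the limitation the RL formulation in the main paper removes.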