Graph Neural Network based Service Function Chaining for Automatic
Network Control
- URL: http://arxiv.org/abs/2009.05240v1
- Date: Fri, 11 Sep 2020 06:01:27 GMT
- Title: Graph Neural Network based Service Function Chaining for Automatic
Network Control
- Authors: DongNyeong Heo, Stanislav Lange, Hee-Gon Kim and Heeyoul Choi
- Abstract summary: Service function chaining (SFC) is an important technology for finding efficient paths through network servers.
We propose a new neural network architecture for SFC based on a graph neural network (GNN).
- Score: 0.4817429789586127
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Software-defined networking (SDN) and network function
virtualization (NFV) have led to great advances in software-based network
control technology by decreasing expenditures. Service function chaining
(SFC) is an important technology for finding efficient paths through network
servers to process all of the requested virtualized network functions
(VNFs). However, SFC is challenging since it has to maintain high Quality of
Service (QoS) even in complicated situations. Although some works have
tackled this task with high-level intelligent models such as deep neural
networks (DNNs), those approaches do not utilize the topology information of
networks efficiently and cannot be applied to networks with a dynamically
changing topology, since their models assume that the topology is fixed. In
this paper, we propose a new neural network architecture for SFC based on a
graph neural network (GNN), which takes the graph-structured properties of
the network topology into account. The proposed SFC model consists of an
encoder and a decoder: the encoder computes a representation of the network
topology, and the decoder then estimates, for each neighbor node, the
probability of being the next hop and the probability of processing a VNF.
In the experiments, our proposed architecture outperformed the DNN-based
baseline model. Moreover, the GNN-based model can be applied to a new
network topology without re-designing or re-training.
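To make the encoder-decoder structure concrete, here is a minimal PyTorch sketch of the idea described in the abstract, not the authors' actual model: a message-passing encoder embeds the topology's nodes, and a decoder scores the current node's neighbors as candidate next hops and predicts per-node VNF-processing probabilities. All class names, layer sizes, and the mean-aggregation scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GNNEncoder(nn.Module):
    """Embeds nodes by repeated mean-aggregation message passing (assumed scheme)."""
    def __init__(self, feat_dim, hid_dim, n_layers=2):
        super().__init__()
        dims = [feat_dim] + [hid_dim] * n_layers
        self.layers = nn.ModuleList(
            nn.Linear(2 * dims[i], dims[i + 1]) for i in range(n_layers)
        )

    def forward(self, x, adj):
        # x: (N, feat_dim) node features; adj: (N, N) binary adjacency matrix
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        for layer in self.layers:
            neigh = (adj @ x) / deg                  # mean over neighbors
            x = F.relu(layer(torch.cat([x, neigh], dim=-1)))
        return x                                     # (N, hid_dim) embeddings

class NeighborDecoder(nn.Module):
    """Scores the current node's neighbors as next hops and predicts
    whether each node should process the pending VNF (illustrative heads)."""
    def __init__(self, hid_dim):
        super().__init__()
        self.next_hop = nn.Bilinear(hid_dim, hid_dim, 1)
        self.process_vnf = nn.Linear(hid_dim, 1)

    def forward(self, h, adj, cur):
        # h: (N, hid_dim) node embeddings; cur: index of node holding the flow
        scores = self.next_hop(h[cur].expand_as(h).contiguous(), h).squeeze(-1)
        scores = scores.masked_fill(adj[cur] == 0, float('-inf'))
        p_next = F.softmax(scores, dim=-1)           # next-hop distribution
        p_vnf = torch.sigmoid(self.process_vnf(h))   # per-node VNF probability
        return p_next, p_vnf.squeeze(-1)

# Usage on a toy 4-node topology:
adj = torch.tensor([[0., 1., 1., 0.],
                    [1., 0., 1., 1.],
                    [1., 1., 0., 1.],
                    [0., 1., 1., 0.]])
x = torch.randn(4, 8)                                # assumed node features
enc, dec = GNNEncoder(8, 16), NeighborDecoder(16)
p_next, p_vnf = dec(enc(x, adj), adj, cur=1)
```

Because the encoder operates on an arbitrary adjacency matrix rather than a fixed-size input, the same weights can in principle be applied to a new topology, which is the transferability property the abstract highlights.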
Related papers
- Graph Metanetworks for Processing Diverse Neural Architectures [33.686728709734105]
Graph Metanetworks (GMNs) generalize to neural architectures where competing methods struggle.
We prove that GMNs are expressive and equivariant to parameter permutation symmetries that leave the input neural network functions unchanged.
arXiv Detail & Related papers (2023-12-07T18:21:52Z)
- Equivariant Matrix Function Neural Networks [1.8717045355288808]
We introduce Matrix Function Neural Networks (MFNs), a novel architecture that parameterizes non-local interactions through analytic matrix equivariant functions.
MFNs are able to capture intricate non-local interactions in quantum systems, paving the way to new state-of-the-art force fields.
arXiv Detail & Related papers (2023-10-16T14:17:00Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Generalization and Estimation Error Bounds for Model-based Neural Networks [78.88759757988761]
We show that the generalization abilities of model-based networks for sparse recovery outperform those of regular ReLU networks.
We derive practical design rules that allow constructing model-based networks with guaranteed high generalization.
arXiv Detail & Related papers (2023-04-19T16:39:44Z)
- Vanilla Feedforward Neural Networks as a Discretization of Dynamical Systems [9.382423715831687]
In this paper, we go back to the classical network structure and prove that vanilla feedforward networks can also be a numerical discretization of dynamical systems.
Our results could provide a new perspective for understanding the approximation properties of feedforward neural networks.
arXiv Detail & Related papers (2022-09-22T10:32:08Z)
- Interference Cancellation GAN Framework for Dynamic Channels [74.22393885274728]
We introduce an online training framework that can adapt to any changes in the channel.
Our framework significantly outperforms recent neural network models on highly dynamic channels.
arXiv Detail & Related papers (2022-08-17T02:01:18Z)
- Scaling Graph-based Deep Learning models to larger networks [2.946140899052065]
Graph Neural Networks (GNN) have shown a strong potential to be integrated into commercial products for network control and management.
This paper presents a GNN-based solution that can effectively scale to larger networks including higher link capacities and aggregated traffic on links.
arXiv Detail & Related papers (2021-10-04T09:04:19Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- Reinforcement Learning of Graph Neural Networks for Service Function Chaining [3.9373541926236766]
Service function chaining (SFC) modules play an important role by generating efficient paths for network traffic through physical servers.
A previous supervised learning method demonstrated that network features can be represented by graph neural networks (GNNs) for the SFC task.
In this paper, we apply reinforcement learning methods for training models on various network topologies with unlabeled data.
arXiv Detail & Related papers (2020-11-17T03:50:53Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio [101.84651388520584]
This paper presents a new framework named network adjustment, which considers network accuracy as a function of FLOPs.
Experiments on standard image classification datasets and a wide range of base networks demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-06T15:51:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.