Scaling Graph-based Deep Learning models to larger networks
- URL: http://arxiv.org/abs/2110.01261v1
- Date: Mon, 4 Oct 2021 09:04:19 GMT
- Title: Scaling Graph-based Deep Learning models to larger networks
- Authors: Miquel Ferriol-Galmés, José Suárez-Varela, Krzysztof Rusek, Pere
Barlet-Ros, Albert Cabellos-Aparicio
- Abstract summary: Graph Neural Networks (GNN) have shown a strong potential to be integrated into commercial products for network control and management.
This paper presents a GNN-based solution that can effectively scale to larger networks including higher link capacities and aggregated traffic on links.
- Score: 2.946140899052065
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNN) have shown a strong potential to be integrated
into commercial products for network control and management. Early works using
GNN have demonstrated an unprecedented capability to learn from different
network characteristics that are fundamentally represented as graphs, such as
the topology, the routing configuration, or the traffic that flows along a
series of nodes in the network. In contrast to previous solutions based on
Machine Learning (ML), GNNs can produce accurate predictions even in networks
unseen during the training phase. Nowadays, GNNs are a hot topic in the
Machine Learning field and, as such, we are witnessing great efforts to
leverage their potential in many different fields (e.g., chemistry, physics,
social networks). In this context, the Graph Neural Networking challenge 2021
targets a practical limitation of existing GNN-based solutions for networking:
the lack of generalization to larger networks. This paper approaches the
scalability problem by presenting a GNN-based solution that can effectively
scale to larger networks including higher link capacities and aggregated
traffic on links.
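The kind of model the abstract describes can be illustrated with a minimal message-passing sketch. All names, weights, and features below are illustrative assumptions, not the paper's actual architecture: a toy topology's nodes carry made-up capacity/traffic features, and each layer mean-aggregates neighbor states before a non-linear update.

```python
import numpy as np

# Toy 4-node network topology (undirected adjacency matrix).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# Per-node features, e.g. normalized link capacity and offered traffic
# (values are made up for illustration).
X = np.array([[0.2, 0.5],
              [0.9, 0.1],
              [0.4, 0.4],
              [0.7, 0.8]])

rng = np.random.default_rng(0)
W_self = rng.normal(size=(2, 2))   # transform of a node's own state
W_nbr = rng.normal(size=(2, 2))    # transform of aggregated neighbor messages

def message_passing_step(A, H, W_self, W_nbr):
    """One GNN layer: mean-aggregate neighbor states, then update."""
    deg = A.sum(axis=1, keepdims=True)          # node degrees
    M = (A @ H) / deg                           # mean of neighbor states
    return np.tanh(H @ W_self + M @ W_nbr)      # non-linear update

H = X
for _ in range(3):                              # three rounds of message passing
    H = message_passing_step(A, H, W_self, W_nbr)

print(H.shape)  # (4, 2): one learned embedding per node
```

Because the update only reads each node's neighborhood through the adjacency matrix, the same weights apply to any topology, which is what lets a trained model be evaluated on networks unseen during training.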
Related papers
- LinkSAGE: Optimizing Job Matching Using Graph Neural Networks [12.088731514483104]
We present LinkSAGE, an innovative framework that integrates Graph Neural Networks (GNNs) into large-scale personalized job matching systems.
Our approach capitalizes on a novel job marketplace graph, the largest and most intricate of its kind in industry, with billions of nodes and edges.
A key innovation in LinkSAGE is its training and serving methodology, which effectively combines inductive graph learning on a heterogeneous, evolving graph with an encoder-decoder GNN model.
arXiv Detail & Related papers (2024-02-20T23:49:25Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- Graph Neural Networks for Communication Networks: Context, Use Cases and Opportunities [4.4568884144849985]
Graph neural networks (GNNs) have shown outstanding applications in many fields where data is fundamentally represented as graphs.
GNNs represent a new generation of data-driven models that can accurately learn and reproduce the complex behaviors behind real networks.
This article comprises a brief tutorial on GNNs and their possible applications to communication networks.
arXiv Detail & Related papers (2021-12-29T19:09:42Z)
- mGNN: Generalizing the Graph Neural Networks to the Multilayer Case [0.0]
We propose mGNN, a framework meant to generalize GNNs to multi-layer networks.
Our approach is general (i.e., not task-specific) and has the advantage of extending any type of GNN without computational overhead.
We validate the framework on three different tasks: node classification, network classification, and link prediction.
arXiv Detail & Related papers (2021-09-21T12:02:12Z)
- IGNNITION: Bridging the Gap Between Graph Neural Networks and Networking Systems [4.1591055164123665]
We present IGNNITION, a novel open-source framework that enables fast prototyping of Graph Neural Networks (GNNs) for networking systems.
IGNNITION is based on an intuitive high-level abstraction that hides the complexity behind GNNs.
Our results show that the GNN models produced by IGNNITION are equivalent in terms of accuracy and performance to their native implementations.
arXiv Detail & Related papers (2021-09-14T14:28:21Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
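The graphon sampling underlying this growing-graph training scheme can be sketched as follows. The particular graphon below (connection probability decaying with latent-position distance) and the size schedule are illustrative assumptions, not the paper's choices; the GNN training step itself is elided.

```python
import numpy as np

rng = np.random.default_rng(42)

def graphon(x, y):
    # Illustrative graphon: edge probability decays with the distance
    # between latent positions x and y (this choice is an assumption).
    return np.exp(-3.0 * np.abs(x[:, None] - y[None, :]))

def sample_graph(n):
    """Bernoulli-sample an n-node undirected graph from the graphon."""
    u = rng.uniform(size=n)                  # latent node positions in [0, 1]
    P = graphon(u, u)                        # pairwise edge probabilities
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)                        # keep upper triangle, no self-loops
    return A + A.T                           # symmetrize

# "Increase and conquer": train on a sequence of growing sampled graphs.
for n in (8, 16, 32, 64):
    A = sample_graph(n)
    # ... a few GNN training epochs on A would run here ...
    print(n, A.shape, int(A.sum() / 2))      # nodes, adjacency shape, edge count
```

Since all sampled graphs come from the same graphon, weights trained on the smaller graphs transfer to the larger ones, which is what makes the successive size increase sensible.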
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Analyzing the Performance of Graph Neural Networks with Pipe Parallelism [2.269587850533721]
We focus on Graph Neural Networks (GNNs) that have found great success in tasks such as node or edge classification and link prediction.
New approaches for processing larger networks are needed to advance graph techniques.
We study how GNNs could be parallelized using existing tools and frameworks that are known to be successful in the deep learning community.
arXiv Detail & Related papers (2020-12-20T04:20:38Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and thereby strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Attentive Graph Neural Networks for Few-Shot Learning [74.01069516079379]
Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks.
Despite their powerful capacity to learn and generalize from few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep.
We propose a novel Attentive GNN to tackle these challenges, by incorporating a triple-attention mechanism.
arXiv Detail & Related papers (2020-07-14T07:43:09Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
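The permutation-equivariance property claimed for graph convolutional filters can be checked numerically. The sketch below applies a polynomial graph filter y = Σ_k h[k] · S^k · x and verifies that filtering a relabeled graph equals relabeling the filtered output; the shift operator and filter taps are arbitrary illustrative values, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def graph_filter(S, x, h):
    """Polynomial graph filter: y = sum_k h[k] * S^k @ x."""
    y = np.zeros_like(x)
    Sk = np.eye(S.shape[0])     # S^0 = I
    for hk in h:
        y += hk * (Sk @ x)
        Sk = Sk @ S             # advance to the next power of S
    return y

n = 5
S = rng.normal(size=(n, n))
S = (S + S.T) / 2               # symmetric graph shift operator
x = rng.normal(size=n)          # graph signal
h = [0.5, -0.3, 0.1]            # arbitrary filter taps

# Random permutation matrix P (node relabeling).
perm = rng.permutation(n)
P = np.eye(n)[perm]

# Equivariance: filtering the permuted graph and signal equals
# permuting the filtered output.
y1 = graph_filter(P @ S @ P.T, P @ x, h)
y2 = P @ graph_filter(S, x, h)
print(np.allclose(y1, y2))  # True
```

The identity holds because (P S Pᵀ)^k = P S^k Pᵀ for any permutation matrix P, so every term of the polynomial commutes with the relabeling.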
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.