Neural Network Tomography
- URL: http://arxiv.org/abs/2001.02942v1
- Date: Thu, 9 Jan 2020 12:19:26 GMT
- Title: Neural Network Tomography
- Authors: Liang Ma and Ziyao Zhang and Mudhakar Srivatsa
- Abstract summary: Network tomography is a classic research problem in the realm of network monitoring.
NeuTomography utilizes a deep neural network and data augmentation to predict the unmeasured performance metrics.
NeuTomography can be employed to reconstruct the original network topology.
- Score: 4.407668482702675
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network tomography, a classic research problem in the realm of network
monitoring, refers to the methodology of inferring unmeasured network
attributes using selected end-to-end path measurements. In the research
community, network tomography is generally investigated under the assumptions
of known network topology, correlated path measurements, bounded number of
faulty nodes/links, or even special network protocol support. The applicability
of network tomography is considerably constrained by these strong assumptions,
which therefore frequently position it in the theoretical world. In this
regard, we revisit network tomography from the practical perspective by
establishing a generic framework that does not rely on any of these assumptions
or the types of performance metrics. Given only the end-to-end path performance
metrics of sampled node pairs, the proposed framework, NeuTomography, utilizes
a deep neural network and data augmentation to predict the unmeasured performance
metrics via learning non-linear relationships between node pairs and underlying
unknown topological/routing properties. In addition, NeuTomography can be
employed to reconstruct the original network topology, which is critical to
most network planning tasks. Extensive experiments using real network data show
that, compared to baseline solutions, NeuTomography can predict network
characteristics and reconstruct network topologies with significantly higher
accuracy and robustness using only limited measurement data.
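The core idea described in the abstract, learning a mapping from (source, destination) node pairs to their end-to-end path metric using only a sampled subset of measurements, can be sketched in a few lines. The following is a minimal illustrative toy, not the authors' implementation: the embedding size, hidden width, training loop, and the additive "delay" ground truth are all assumptions made for the example.

```python
import numpy as np

# Minimal sketch of the NeuTomography idea: learn per-node embeddings and a
# small MLP that maps a node pair to its end-to-end path metric, training only
# on measured pairs, then predict an unmeasured pair. Sizes and data are toy.
rng = np.random.default_rng(0)

n_nodes, dim, hidden = 6, 4, 16
emb = rng.normal(scale=0.1, size=(n_nodes, dim))    # learnable node embeddings
W1 = rng.normal(scale=0.1, size=(2 * dim, hidden))  # MLP weights
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=hidden)
b2 = 0.0

# Toy ground truth: symmetric path delay built from hidden per-node costs.
node_cost = rng.uniform(1.0, 5.0, size=n_nodes)
def true_delay(i, j):
    return node_cost[i] + node_cost[j]

# Measure only a subset of node pairs (the sampled path measurements).
pairs = [(i, j) for i in range(n_nodes) for j in range(n_nodes) if i < j]
rng.shuffle(pairs)
measured, unmeasured = pairs[:10], pairs[10:]
y = {p: true_delay(*p) for p in measured}

def forward(i, j):
    x = np.concatenate([emb[i], emb[j]])
    h = np.maximum(0.0, x @ W1 + b1)                # ReLU hidden layer
    return x, h, h @ W2 + b2

lr = 0.05
for epoch in range(2000):                           # plain SGD on squared error
    for (i, j) in measured:
        x, h, pred = forward(i, j)
        err = pred - y[(i, j)]
        gW2 = err * h
        gb2 = err
        gh = err * W2 * (h > 0)                     # backprop through ReLU
        gW1 = np.outer(x, gh)
        gb1 = gh
        gx = W1 @ gh
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
        emb[i] -= lr * gx[:dim]; emb[j] -= lr * gx[dim:]

# Predict an unmeasured pair alongside the (hidden) ground truth.
i, j = unmeasured[0]
_, _, pred = forward(i, j)
print(float(pred), true_delay(i, j))
```

The real framework additionally uses data augmentation and can invert the learned relationships to reconstruct the topology; this sketch only shows the supervised pair-to-metric regression at its core.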
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
arXiv Detail & Related papers (2023-07-20T13:34:11Z)
- BS-GAT Behavior Similarity Based Graph Attention Network for Network Intrusion Detection [20.287285893803244]
This paper proposes a graph neural network algorithm based on behavior similarity (BS-GAT) using a graph attention network.
The results show that the proposed method is effective and performs better than existing solutions.
arXiv Detail & Related papers (2023-04-07T09:42:07Z)
- Classification of vertices on social networks by multiple approaches [1.370151489527964]
In the case of social networks, it is crucial to evaluate the labels of discrete communities.
For each of these interaction-based entities, a social graph, a mailing dataset, and two citation sets are selected as the testbench repositories.
This paper not only assessed the most valuable method but also examined how graph neural networks work.
arXiv Detail & Related papers (2023-01-13T09:42:55Z)
- Quasi-orthogonality and intrinsic dimensions as measures of learning and generalisation [55.80128181112308]
We show that dimensionality and quasi-orthogonality of neural networks' feature space may jointly serve as a network's performance discriminants.
Our findings suggest important relationships between the networks' final performance and properties of their randomly initialised feature spaces.
arXiv Detail & Related papers (2022-03-30T21:47:32Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Topological obstructions in neural networks learning [67.8848058842671]
We study global properties of the loss gradient function flow.
We use topological data analysis of the loss function and its Morse complex to relate local behavior along gradient trajectories with global properties of the loss surface.
arXiv Detail & Related papers (2020-12-31T18:53:25Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.