How Low Can You Go? Searching for the Intrinsic Dimensionality of Complex Networks using Metric Node Embeddings
- URL: http://arxiv.org/abs/2503.01723v1
- Date: Mon, 03 Mar 2025 16:37:38 GMT
- Title: How Low Can You Go? Searching for the Intrinsic Dimensionality of Complex Networks using Metric Node Embeddings
- Authors: Nikolaos Nakis, Niels Raunkjær Holm, Andreas Lyhne Fiehn, Morten Mørup
- Abstract summary: Low-dimensional embeddings are essential for machine learning tasks involving graphs. We prove that lower dimensional embeddings are possible when using Euclidean metric embeddings. For the first time, we demonstrate that even large-scale networks can be effectively embedded in very low-dimensional spaces.
- Score: 1.7061868168035932
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Low-dimensional embeddings are essential for machine learning tasks involving graphs, such as node classification, link prediction, community detection, network visualization, and network compression. Although recent studies have identified exact low-dimensional embeddings, the limits of the required embedding dimensions remain unclear. We prove that lower dimensional embeddings are possible when using Euclidean metric embeddings as opposed to vector-based Logistic PCA (LPCA) embeddings. In particular, we provide an efficient logarithmic search procedure for identifying the exact embedding dimension, and demonstrate how metric embeddings enable inference of the exact embedding dimension of large-scale networks by exploiting their metric properties to obtain linearithmic scaling. Empirically, we show that our approach extracts substantially lower dimensional representations of networks than previously reported for small-sized networks. For the first time, we demonstrate that even large-scale networks can be effectively embedded in very low-dimensional spaces, and provide examples of scalable, exact reconstruction for graphs with up to a million nodes. Our approach highlights that the intrinsic dimensionality of networks is substantially lower than previously reported and provides a computationally efficient assessment of the exact embedding dimension, even for large-scale networks. The surprisingly low-dimensional representations achieved demonstrate that networks in general can be losslessly represented in very low-dimensional feature spaces, which can guide existing network analysis tasks from community detection and node classification to structure-revealing exact network visualizations.
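To make the logarithmic search idea concrete, here is a minimal sketch: binary-search the smallest dimension d at which a Euclidean metric embedding reconstructs the graph exactly. The `fit_metric_embedding` routine below is a generic logistic fit written only for illustration, not the authors' implementation, and the search assumes that if dimension d admits an exact embedding then so does d+1.

```python
# Hedged sketch: binary search for the smallest dimension at which a Euclidean
# metric embedding reconstructs a graph exactly. Illustration only, not the
# paper's implementation.
import numpy as np

def fit_metric_embedding(A, d, steps=3000, lr=0.1, seed=0):
    """Fit x_i in R^d and bias b so that edges satisfy ||x_i - x_j|| < b."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = rng.normal(scale=0.1, size=(n, d))
    b = 1.0
    iu = np.triu_indices(n, k=1)
    y = A[iu].astype(float)                       # 1 = edge, 0 = non-edge
    for _ in range(steps):
        diff = X[iu[0]] - X[iu[1]]
        dist = np.linalg.norm(diff, axis=1) + 1e-9
        p = 1.0 / (1.0 + np.exp(-(b - dist)))     # P(edge) = sigma(b - d_ij)
        g = p - y                                 # dLoss/dlogit (logistic)
        gx = (-g / dist)[:, None] * diff          # dLoss/dx_i per pair
        np.add.at(X, iu[0], -lr * gx / n)
        np.add.at(X, iu[1],  lr * gx / n)
        b -= lr * g.mean()
    dist = np.linalg.norm(X[iu[0]] - X[iu[1]], axis=1)
    return np.all((dist < b) == (y == 1))         # exact reconstruction?

def smallest_exact_dimension(A, d_max=16):
    # assumes d_max suffices and that success is monotone in d
    lo, hi = 1, d_max
    while lo < hi:                                # O(log d_max) fits
        mid = (lo + hi) // 2
        if fit_metric_embedding(A, mid):
            hi = mid                              # d works => search lower
        else:
            lo = mid + 1
    return lo

# toy graph: two triangles joined by one edge
A = np.zeros((6, 6), int)
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
print("smallest exact dimension found:", smallest_exact_dimension(A))
```

The paper's linearithmic scaling additionally exploits metric properties (e.g. the triangle inequality) to avoid scoring all node pairs; the dense all-pairs loss above is only viable for small graphs.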
Related papers
- Quasi-orthogonality and intrinsic dimensions as measures of learning and generalisation [55.80128181112308]
We show that the dimensionality and quasi-orthogonality of a neural network's feature space may jointly serve as discriminants of the network's performance.
Our findings suggest important relationships between the networks' final performance and properties of their randomly initialised feature spaces.
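As a hedged illustration (not the paper's exact metric), one common way to quantify quasi-orthogonality is the mean absolute pairwise cosine similarity of feature vectors: values near zero mean the features behave almost orthogonally, as high-dimensional random vectors do.

```python
# Sketch: quasi-orthogonality of random feature vectors, measured as the mean
# absolute off-diagonal cosine similarity. Shrinks roughly like 1/sqrt(dim).
import numpy as np

def mean_abs_cosine(F):
    """F: (n_features, dim) matrix of feature vectors."""
    F = F / np.linalg.norm(F, axis=1, keepdims=True)
    C = F @ F.T
    off = C[~np.eye(len(F), dtype=bool)]          # drop self-similarities
    return np.abs(off).mean()

rng = np.random.default_rng(0)
for dim in (8, 64, 512, 4096):
    F = rng.normal(size=(100, dim))               # randomly initialised features
    print(dim, round(mean_abs_cosine(F), 4))
```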
arXiv Detail & Related papers (2022-03-30T21:47:32Z)
- Community detection using low-dimensional network embedding algorithms [1.052782170493037]
We rigorously characterise the performance of two major algorithms, DeepWalk and node2vec, in recovering communities for canonical network models.
We prove that, given some fixed co-occurrence window, node2vec using random walks with a low non-backtracking probability can succeed for much sparser networks.
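Below is a hedged sketch of the key ingredient, a walk with a low backtracking probability. The `p_backtrack` parameter is illustrative only; node2vec itself controls this behaviour through its return parameter p rather than an explicit probability.

```python
# Sketch: random walk that rarely returns to the previous node, the property
# the paper highlights for community recovery on sparse graphs.
import random

def non_backtracking_walk(adj, start, length, p_backtrack=0.05, seed=0):
    rng = random.Random(seed)
    walk = [start]
    prev = None
    for _ in range(length - 1):
        nbrs = adj[walk[-1]]
        # avoid returning to the previous node unless it is the only option
        choices = [v for v in nbrs if v != prev] or nbrs
        if prev in nbrs and rng.random() < p_backtrack:
            nxt = prev                            # rare backtrack step
        else:
            nxt = rng.choice(choices)
        prev = walk[-1]
        walk.append(nxt)
    return walk

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(non_backtracking_walk(adj, start=0, length=10))
```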
arXiv Detail & Related papers (2021-11-04T14:57:43Z)
- Dive into Layers: Neural Network Capacity Bounding using Algebraic Geometry [55.57953219617467]
We show that the learnability of a neural network is directly related to its size.
We use Betti numbers to measure the topological geometric complexity of input data and the neural network.
We perform experiments on the real-world dataset MNIST, and the results verify our analysis and conclusions.
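For a graph viewed as a 1-dimensional complex, the first two Betti numbers have closed forms: beta_0 is the number of connected components and beta_1 = |E| - |V| + beta_0 (the cycle rank). The paper's analysis of data and network complexity needs full persistent-homology machinery; the sketch below covers only this graph special case.

```python
# Sketch: Betti numbers of a graph (1-complex) via union-find.
def betti_graph(n, edges):
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]         # path halving
            x = parent[x]
        return x
    for u, v in edges:
        parent[find(u)] = find(v)
    beta0 = len({find(v) for v in range(n)})      # connected components
    beta1 = len(edges) - n + beta0                # independent cycles
    return beta0, beta1

# a triangle plus an isolated node: two components, one loop
print(betti_graph(4, [(0, 1), (1, 2), (0, 2)]))   # -> (2, 1)
```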
arXiv Detail & Related papers (2021-09-03T11:45:51Z)
- Semi-supervised Network Embedding with Differentiable Deep Quantisation [81.49184987430333]
We develop d-SNEQ, a differentiable quantisation method for network embedding.
d-SNEQ incorporates a rank loss to equip the learned quantisation codes with rich high-order information.
It is able to substantially compress the size of trained embeddings, thus reducing storage footprint and accelerating retrieval speed.
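A hedged sketch of the compression idea: replace each embedding sub-vector with the index of its nearest codebook entry (product quantisation). d-SNEQ learns the codes end-to-end with a rank loss; the plain k-means below is only the classical, non-differentiable analogue.

```python
# Sketch: product quantisation of node embeddings with per-subspace k-means.
import numpy as np

def product_quantise(E, n_sub=4, k=16, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = E.shape
    parts = np.split(E, n_sub, axis=1)            # d must be divisible by n_sub
    codebooks, codes = [], []
    for P in parts:
        C = P[rng.choice(n, k, replace=False)]    # init centroids from data
        for _ in range(iters):                    # plain k-means
            a = np.argmin(((P[:, None] - C[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if (a == j).any():
                    C[j] = P[a == j].mean(0)
        codebooks.append(C)
        codes.append(a)
    return codebooks, np.stack(codes, 1)          # codes: (n, n_sub) small ints

E = np.random.default_rng(1).normal(size=(1000, 32))
books, codes = product_quantise(E)
# storage drops from 32 floats to 4 small integer codes per node
print(codes.shape)
```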
arXiv Detail & Related papers (2021-08-20T11:53:05Z)
- Inductive Graph Embeddings through Locality Encodings [0.42970700836450487]
We look at the problem of finding inductive network embeddings in large networks without domain-dependent node/edge attributes.
We propose to use a set of basic predefined local encodings as the basis of a learning algorithm.
This method achieves state-of-the-art performance in tasks such as role detection, link prediction and node classification.
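A hedged sketch of what "predefined local encodings" can look like: attribute-free structural features (degree, clustering coefficient, mean neighbour degree) computed identically on any graph, hence usable inductively on unseen nodes. The paper's actual encoding set differs; these are generic stand-ins.

```python
# Sketch: purely structural node features as inductive locality encodings.
import networkx as nx
import numpy as np

def local_encoding(G, v):
    deg = G.degree(v)
    clust = nx.clustering(G, v)                   # local clustering coefficient
    nbr_deg = np.mean([G.degree(u) for u in G[v]]) if deg else 0.0
    return np.array([deg, clust, nbr_deg])

G = nx.karate_club_graph()
X = np.array([local_encoding(G, v) for v in G])   # node feature matrix
print(X.shape)   # (34, 3): ready for any downstream classifier
```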
arXiv Detail & Related papers (2020-09-26T13:09:11Z)
- ESPN: Extremely Sparse Pruned Networks [50.436905934791035]
We show that a simple iterative mask discovery method can achieve state-of-the-art compression of very deep networks.
Our algorithm represents a hybrid approach between single shot network pruning methods and Lottery-Ticket type approaches.
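As a hedged sketch of iterative mask discovery, the loop below repeatedly zeroes the smallest surviving weights and hands the rest to an optional fine-tuning callback. ESPN's actual criterion and schedule differ; this shows only the loop structure common to magnitude-pruning approaches.

```python
# Sketch: iterative magnitude pruning toward a target sparsity.
import numpy as np

def iterative_prune(W, target_sparsity=0.95, rounds=5, retrain=None):
    mask = np.ones_like(W, dtype=bool)
    # prune an equal fraction of survivors per round so the final
    # density is (1 - target_sparsity)
    per_round = 1 - (1 - target_sparsity) ** (1 / rounds)
    for _ in range(rounds):
        alive = np.abs(W[mask])
        thresh = np.quantile(alive, per_round)    # cut the smallest fraction
        mask &= np.abs(W) > thresh
        W *= mask
        if retrain is not None:
            W = retrain(W, mask)                  # user-supplied fine-tuning
    return W, mask

W = np.random.default_rng(0).normal(size=(256, 256))
W, mask = iterative_prune(W)
print(f"sparsity: {1 - mask.mean():.3f}")
```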
arXiv Detail & Related papers (2020-06-28T23:09:27Z)
- High-dimensional Convolutional Networks for Geometric Pattern Recognition [75.43345656210992]
We present high-dimensional convolutional networks (ConvNets) for pattern recognition problems.
We first study the effectiveness of convolutional networks in detecting linear subspaces in high-dimensional spaces with up to 32 dimensions.
We then apply high-dimensional ConvNets to 3D registration under rigid motions and image correspondence estimation.
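For orientation, here is the detection target minus the ConvNet: deciding whether a point cloud in R^D concentrates near a k-dimensional linear subspace. The spectral test via SVD below is the classical baseline for that task, not the paper's learned method.

```python
# Sketch: spectral test for points lying near a k-dim linear subspace.
import numpy as np

def subspace_energy(P, k):
    """Fraction of variance captured by the best k-dim subspace through 0."""
    s = np.linalg.svd(P, compute_uv=False)
    return (s[:k] ** 2).sum() / (s ** 2).sum()

rng = np.random.default_rng(0)
D, k, n = 32, 3, 500
basis = np.linalg.qr(rng.normal(size=(D, k)))[0]  # random k-dim basis
on_plane = rng.normal(size=(n, k)) @ basis.T + 0.01 * rng.normal(size=(n, D))
noise = rng.normal(size=(n, D))
print(subspace_energy(on_plane, k))   # ~ 1.0: points lie on a subspace
print(subspace_energy(noise, k))      # ~ k/D: no linear structure
```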
arXiv Detail & Related papers (2020-05-17T01:46:12Z)
- BiDet: An Efficient Binarized Object Detector [96.19708396510894]
We propose a binarized neural network learning method called BiDet for efficient object detection.
Our BiDet fully utilizes the representational capacity of the binary neural networks for object detection by redundancy removal.
Our method outperforms the state-of-the-art binary neural networks by a sizable margin.
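A hedged sketch of the generic ingredient such detectors build on: binarising a weight tensor to {-1, +1} with a per-output-channel scale that minimises the L2 reconstruction error (the classical XNOR-Net-style estimator). BiDet's contribution, the information-bottleneck-based redundancy removal, sits on top of this and is not shown here.

```python
# Sketch: weight binarisation with the L2-optimal per-channel scale.
import numpy as np

def binarise(W):
    """W: (out_channels, ...) -> sign weights and optimal scales."""
    flat = W.reshape(W.shape[0], -1)
    alpha = np.abs(flat).mean(axis=1)             # argmin ||W - a*sign(W)||^2
    B = np.sign(flat)
    return B.reshape(W.shape), alpha

W = np.random.default_rng(0).normal(size=(16, 3, 3, 3))
B, alpha = binarise(W)
err = np.linalg.norm(W - alpha[:, None, None, None] * B) / np.linalg.norm(W)
print(f"relative reconstruction error: {err:.3f}")
```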
arXiv Detail & Related papers (2020-03-09T08:16:16Z)
- EPINE: Enhanced Proximity Information Network Embedding [2.257737378757467]
In this work, we focus on mining valuable information in adjacency matrices at a deeper level.
Under the same objective, many NE methods calculate high-order proximity by the powers of adjacency matrices.
We propose to redefine high-order proximity in a more intuitive manner.
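The construction the summary refers to, sketched below: high-order proximity as a decayed sum of adjacency-matrix powers, where A^k counts k-step walks and beta discounts longer ones. This is the Katz-style formulation many network-embedding methods share; EPINE's redefinition differs in the details.

```python
# Sketch: high-order proximity as a decayed sum of adjacency powers.
import numpy as np

def high_order_proximity(A, order=4, beta=0.5):
    S = np.zeros_like(A, dtype=float)
    Ak = np.eye(len(A))
    for k in range(1, order + 1):
        Ak = Ak @ A                   # A^k: number of k-step walks
        S += (beta ** k) * Ak         # discount longer walks
    return S

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
print(np.round(high_order_proximity(A), 3))
```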
arXiv Detail & Related papers (2020-03-04T15:57:17Z)
- Neural Network Tomography [4.407668482702675]
Network tomography is a classic research problem in the realm of network monitoring.
NeuTomography utilizes deep neural networks and data augmentation to predict unmeasured performance metrics.
NeuTomography can be employed to reconstruct the original network topology.
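For context, here is the classical linear special case of the problem NeuTomography tackles with a deep network: recovering per-link metrics from end-to-end path measurements. With additive delays the task reduces to least squares on the routing matrix; the paper replaces this with a neural predictor plus data augmentation to handle nonlinear and partially observed settings.

```python
# Sketch: classical network tomography, additive-delay case via least squares.
import numpy as np

# routing matrix R: R[p, l] = 1 if path p traverses link l
R = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 0]], float)
true_link_delay = np.array([2.0, 5.0, 1.0, 3.0])
measured = R @ true_link_delay + 0.01 * np.random.default_rng(0).normal(size=5)

est, *_ = np.linalg.lstsq(R, measured, rcond=None)
print(np.round(est, 2))   # close to the true per-link delays
```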
arXiv Detail & Related papers (2020-01-09T12:19:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including any of its content) and is not responsible for any consequences arising from its use.