Node Embeddings and Exact Low-Rank Representations of Complex Networks
- URL: http://arxiv.org/abs/2006.05592v2
- Date: Fri, 16 Oct 2020 05:38:21 GMT
- Title: Node Embeddings and Exact Low-Rank Representations of Complex Networks
- Authors: Sudhanshu Chanpuriya, Cameron Musco, Konstantinos Sotiropoulos,
Charalampos E. Tsourakakis
- Abstract summary: Recent work by Seshadhri et al. suggests that low-dimensional embeddings cannot capture local structure arising in complex networks.
We show that the results of Seshadhri et al. are intimately connected to the model they use rather than the low-dimensional structure of complex networks.
- Score: 30.869784223109832
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low-dimensional embeddings, from classical spectral embeddings to modern
neural-net-inspired methods, are a cornerstone in the modeling and analysis of
complex networks. Recent work by Seshadhri et al. (PNAS 2020) suggests that
such embeddings cannot capture local structure arising in complex networks. In
particular, they show that any network generated from a natural low-dimensional
model cannot be both sparse and have high triangle density (high clustering
coefficient), two hallmark properties of many real-world networks.
In this work we show that the results of Seshadhri et al. are intimately
connected to the model they use rather than the low-dimensional structure of
complex networks. Specifically, we prove that a minor relaxation of their model
can generate sparse graphs with high triangle density. Surprisingly, we show
that this same model leads to exact low-dimensional factorizations of many
real-world networks. We give a simple algorithm based on logistic principal
component analysis (LPCA) that succeeds in finding such exact embeddings.
Finally, we perform a large number of experiments that verify the ability of
very low-dimensional embeddings to capture local structure in real-world
networks.
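The LPCA approach described in the abstract can be sketched roughly as follows: fit factor matrices U and V so that the elementwise logistic function of U Vᵀ matches the 0/1 adjacency matrix. This is a minimal illustrative sketch, not the authors' algorithm; the plain gradient descent, learning rate, and initialization are all assumptions.

```python
import numpy as np

def logistic_pca(A, rank, lr=0.1, iters=5000, seed=0):
    """Fit factors U, V so that sigmoid(U @ V.T) approximates the
    0/1 adjacency matrix A, by gradient descent on the logistic loss.
    (Illustrative sketch only; hyperparameters are arbitrary choices.)"""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    U = rng.normal(scale=0.1, size=(n, rank))
    V = rng.normal(scale=0.1, size=(n, rank))
    for _ in range(iters):
        P = 1.0 / (1.0 + np.exp(-(U @ V.T)))  # predicted edge probabilities
        G = P - A                             # gradient of the loss w.r.t. the logits
        U, V = U - lr * (G @ V), V - lr * (G.T @ U)
    return U, V

# Tiny example: a 4-node cycle; a rank-2 logit matrix suffices for its sign pattern.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
U, V = logistic_pca(A, rank=2)
P = 1.0 / (1.0 + np.exp(-(U @ V.T)))
print(np.round(P).astype(int))  # should round back to the cycle's adjacency matrix
```

Rounding the fitted probabilities recovers the adjacency matrix exactly, which is the sense of "exact embedding" used in the abstract.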
Related papers
- Defining Neural Network Architecture through Polytope Structures of Dataset [53.512432492636236]
This paper defines upper and lower bounds for neural network widths, which are informed by the polytope structure of the dataset in question.
We develop an algorithm to investigate a converse situation where the polytope structure of a dataset can be inferred from its corresponding trained neural networks.
It is established that popular datasets such as MNIST, Fashion-MNIST, and CIFAR10 can be efficiently encapsulated using no more than two polytopes with a small number of faces.
arXiv Detail & Related papers (2024-02-04T08:57:42Z) - On Characterizing the Evolution of Embedding Space of Neural Networks
using Algebraic Topology [9.537910170141467]
We study how the topology of feature embedding space changes as it passes through the layers of a well-trained deep neural network (DNN) through Betti numbers.
We demonstrate that as depth increases, a topologically complicated dataset is transformed into a simple one, resulting in Betti numbers attaining their lowest possible value.
arXiv Detail & Related papers (2023-11-08T10:45:12Z) - Learning multi-scale local conditional probability models of images [7.07848787073901]
Deep neural networks can learn powerful prior probability models for images, as evidenced by the high-quality generations obtained with recent score-based diffusion methods.
But the means by which these networks capture complex global statistical structure, apparently without suffering from the curse of dimensionality, remain a mystery.
We incorporate diffusion methods into a multi-scale decomposition, reducing dimensionality by assuming a stationary local Markov model for wavelet coefficients conditioned on coarser-scale coefficients.
arXiv Detail & Related papers (2023-03-06T09:23:14Z) - Flattening-Net: Deep Regular 2D Representation for 3D Point Cloud
Analysis [66.49788145564004]
We present an unsupervised deep neural architecture called Flattening-Net to represent irregular 3D point clouds of arbitrary geometry and topology.
Our method performs favorably against current state-of-the-art competitors.
arXiv Detail & Related papers (2022-12-17T15:05:25Z) - Low-Rank Representations Towards Classification Problem of Complex
Networks [0.0]
Complex networks representing social interactions, brain activity, and molecular structures have been studied widely to understand and predict their characteristics as graphs.
Models and algorithms for these networks are used in real-life applications such as search engines and recommender systems.
We study the performance of low-rank representations of real-life networks on a network classification problem.
arXiv Detail & Related papers (2022-10-20T19:56:18Z) - Dive into Layers: Neural Network Capacity Bounding using Algebraic
Geometry [55.57953219617467]
We show that the learnability of a neural network is directly related to its size.
We use Betti numbers to measure the topological geometric complexity of input data and the neural network.
We perform experiments on the real-world MNIST dataset, and the results verify our analysis and conclusions.
arXiv Detail & Related papers (2021-09-03T11:45:51Z) - Neural Geometric Level of Detail: Real-time Rendering with Implicit 3D
Shapes [77.6741486264257]
We introduce an efficient neural representation that, for the first time, enables real-time rendering of high-fidelity neural SDFs.
We show that our representation is 2-3 orders of magnitude more efficient in terms of rendering speed compared to previous works.
arXiv Detail & Related papers (2021-01-26T18:50:22Z) - The impossibility of low rank representations for triangle-rich complex
networks [9.550745725703292]
We argue that such graph embeddings do not capture salient properties of complex networks.
We mathematically prove that any embedding that successfully reproduces both sparsity and high triangle density must have rank nearly linear in the number of vertices.
Among other implications, this establishes that popular embedding techniques such as Singular Value Decomposition and node2vec fail to capture significant structural aspects of real-world complex networks.
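The quantity at stake in this result is triangle density. As a small illustration (not taken from the paper), the number of triangles in a simple undirected graph can be read off its adjacency matrix as tr(A³)/6:

```python
import numpy as np

def triangle_count(A):
    """Number of triangles in a simple undirected graph: tr(A^3) / 6,
    since each closed 3-walk counts a triangle once per starting vertex
    and once per direction."""
    return int(round(np.trace(A @ A @ A) / 6))

# The complete graph K4 contains C(4, 3) = 4 triangles.
K4 = np.ones((4, 4)) - np.eye(4)
print(triangle_count(K4))  # 4
```

Real-world networks tend to keep this count high (relative to their edge count) even though they are sparse, which is the combination the theorem says low-rank dot-product embeddings cannot reproduce.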
arXiv Detail & Related papers (2020-03-27T20:57:56Z) - Convolutional Occupancy Networks [88.48287716452002]
We propose Convolutional Occupancy Networks, a more flexible implicit representation for detailed reconstruction of objects and 3D scenes.
By combining convolutional encoders with implicit occupancy decoders, our model incorporates inductive biases, enabling structured reasoning in 3D space.
We empirically find that our method enables the fine-grained implicit 3D reconstruction of single objects, scales to large indoor scenes, and generalizes well from synthetic to real data.
arXiv Detail & Related papers (2020-03-10T10:17:07Z) - Latent Poisson models for networks with heterogeneous density [0.0]
Empirical networks are often globally sparse, with a small average number of connections per node, when compared to the total size of the network.
We show how latent Poisson models which generate hidden multigraphs can be effective at capturing this density, while being more tractable mathematically than some of the alternatives that model simple graphs directly.
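As an illustrative sketch of the latent Poisson idea (the latent positions, rates, and dimensions below are arbitrary assumptions, not the paper's model), one can draw multigraph edge counts from per-pair Poisson rates and then collapse repeated edges to obtain a simple graph:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50
# Hypothetical nonnegative latent node positions; the exponential scale is arbitrary.
X = rng.exponential(scale=0.15, size=(n, 2))
Lam = X @ X.T                        # nonnegative Poisson rates per node pair
np.fill_diagonal(Lam, 0.0)           # no self-loops
M = rng.poisson(np.triu(Lam, 1))     # multigraph edge counts (upper triangle)
M = M + M.T                          # symmetrize
A = (M > 0).astype(int)              # collapse multi-edges to a simple graph
print("multi-edges:", int(M.sum() // 2), "simple edges:", int(A.sum() // 2))
```

The collapsing step is what makes the model mathematically tractable: the hidden multigraph has independent Poisson edge counts, while the observed simple graph is a deterministic function of it.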
arXiv Detail & Related papers (2020-02-18T18:58:13Z) - Seismic horizon detection with neural networks [62.997667081978825]
This paper presents open-source research applying a binary segmentation approach to the task of horizon detection on multiple real seismic cubes, with a focus on inter-cube generalization of the predictive model.
arXiv Detail & Related papers (2020-01-10T11:30:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.