Local Intrinsic Dimensionality for Dynamic Graph Embeddings
- URL: http://arxiv.org/abs/2411.16145v1
- Date: Mon, 25 Nov 2024 07:11:45 GMT
- Title: Local Intrinsic Dimensionality for Dynamic Graph Embeddings
- Authors: Dušica Knežević, Miloš Savić, Miloš Radovanović
- Abstract summary: Local intrinsic dimensionality (LID) has important theoretical implications and practical applications in the fields of data mining and machine learning.
Recent research efforts indicate that LID measures defined for graphs can improve graph representational learning methods based on random walks.
In this paper, we discuss how NC-LID, a LID measure designed for static graphs, can be adapted for dynamic networks.
- Score: 0.0
- License:
- Abstract: The notion of local intrinsic dimensionality (LID) has important theoretical implications and practical applications in the fields of data mining and machine learning. Recent research efforts indicate that LID measures defined for graphs can improve graph representational learning methods based on random walks. In this paper, we discuss how NC-LID, a LID measure designed for static graphs, can be adapted for dynamic networks. Focusing on dynnode2vec as the most representative dynamic graph embedding method based on random walks, we examine correlations between NC-LID and the intrinsic quality of 10 real-world dynamic network embeddings. The obtained results show that NC-LID can be used as a good indicator of nodes whose embedding vectors do not tend to preserve temporal graph structure well. Thus, our empirical findings constitute the first step towards LID-aware dynamic graph embedding methods.
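The core empirical step the abstract describes is correlating a per-node measure (NC-LID) with per-node embedding quality. A minimal sketch of that kind of analysis is below, using Spearman rank correlation implemented from scratch; the `nc_lid` and `recon_error` values are toy inputs for illustration, not the paper's data or its exact quality metric.

```python
# Sketch: correlate a per-node LID-style measure with that node's
# link reconstruction error, as a proxy for the paper's analysis.
# All inputs are illustrative; the real study uses NC-LID values
# computed on 10 real-world dynamic networks and dynnode2vec embeddings.

def rank(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # shared rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy per-node values: a high positive correlation would support
# NC-LID as an indicator of poorly embedded nodes.
nc_lid = [0.4, 1.2, 0.9, 2.1, 0.3]
recon_error = [0.10, 0.35, 0.30, 0.60, 0.08]
print(round(spearman(nc_lid, recon_error), 3))  # → 1.0 for this monotone toy data
```

A rank correlation is the natural choice here because it asks only whether higher NC-LID tends to accompany higher error, without assuming a linear relationship.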
Related papers
- Expressivity of Representation Learning on Continuous-Time Dynamic Graphs: An Information-Flow Centric Review [2.310679096120274]
This paper provides a comprehensive review of Graph Representation Learning (GRL) on Continuous-Time Dynamic Graph (CTDG) models.
We introduce a novel theoretical framework that analyzes the expressivity of CTDG models through an Information-Flow (IF) lens.
arXiv Detail & Related papers (2024-12-05T00:12:50Z) - Information propagation dynamics in Deep Graph Networks [1.8130068086063336]
Deep Graph Networks (DGNs) have emerged as a family of deep learning models that can process and learn structured information.
This thesis investigates the dynamics of information propagation within DGNs for static and dynamic graphs, focusing on their design as dynamical systems.
arXiv Detail & Related papers (2024-10-14T12:55:51Z) - When Graph Neural Networks Meet Dynamic Mode Decomposition [34.16727363891593]
We introduce a family of DMD-GNN models that effectively leverage the low-rank eigenfunctions provided by the DMD algorithm.
Our work paves the path for applying advanced dynamical system analysis tools via GNNs.
arXiv Detail & Related papers (2024-10-08T01:09:48Z) - Dynamic Neural Dowker Network: Approximating Persistent Homology in Dynamic Directed Graphs [11.646514065979323]
This paper introduces the Dynamic Neural Dowker Network (DNDN), a novel framework specifically designed to approximate the results of dynamic Dowker filtration.
Our approach is validated through comprehensive experiments on real-world datasets.
arXiv Detail & Related papers (2024-08-17T07:13:12Z) - Joint Feature and Differentiable $ k $-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep FS method that simultaneously conducts feature selection and differentiable $ k $-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability issue of learning $ k $-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-21T08:15:55Z) - EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z) - Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z) - Local Intrinsic Dimensionality Measures for Graphs, with Applications to Graph Embeddings [1.1602089225841632]
We propose NC-LID, a novel LID-related measure for quantifying the discriminatory power of the shortest-path distance with respect to natural communities of nodes as their intrinsic localities.
It is shown how this measure can be used to design LID-aware graph embedding algorithms by formulating two LID-elastic variants of node2vec.
Our empirical analysis of NC-LID on a large number of real-world graphs shows that this measure is able to point to nodes with high link reconstruction errors in node2vec embeddings better than node centrality metrics.
arXiv Detail & Related papers (2022-08-25T10:32:07Z) - On the spatial attention in Spatio-Temporal Graph Convolutional Networks for skeleton-based human action recognition [97.14064057840089]
Graph convolutional networks (GCNs) have shown promising performance in skeleton-based human action recognition by modeling a sequence of skeletons as a graph.
Most of the recently proposed spatio-temporal-based methods improve the performance by learning the graph structure at each layer of the network.
arXiv Detail & Related papers (2020-11-07T19:03:04Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z) - Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.