Local Intrinsic Dimensionality Measures for Graphs, with Applications to
Graph Embeddings
- URL: http://arxiv.org/abs/2208.11986v2
- Date: Thu, 13 Jul 2023 12:40:11 GMT
- Title: Local Intrinsic Dimensionality Measures for Graphs, with Applications to
Graph Embeddings
- Authors: Miloš Savić, Vladimir Kurbalija, Miloš Radovanović
- Abstract summary: We propose NC-LID, a novel LID-related measure for quantifying the discriminatory power of the shortest-path distance with respect to natural communities of nodes as their intrinsic localities.
It is shown how this measure can be used to design LID-aware graph embedding algorithms by formulating two LID-elastic variants of node2vec.
Our empirical analysis of NC-LID on a large number of real-world graphs shows that this measure is able to point to nodes with high link reconstruction errors in node2vec embeddings better than node centrality metrics.
- Score: 1.1602089225841632
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The notion of local intrinsic dimensionality (LID) is an important
advancement in data dimensionality analysis, with applications in data mining,
machine learning and similarity search problems. Existing distance-based LID
estimators were designed for tabular datasets encompassing data points
represented as vectors in a Euclidean space. After discussing their limitations
for graph-structured data considering graph embeddings and graph distances, we
propose NC-LID, a novel LID-related measure for quantifying the discriminatory
power of the shortest-path distance with respect to natural communities of
nodes as their intrinsic localities. It is shown how this measure can be used
to design LID-aware graph embedding algorithms by formulating two LID-elastic
variants of node2vec with personalized hyperparameters that are adjusted
according to NC-LID values. Our empirical analysis of NC-LID on a large number
of real-world graphs shows that this measure is able to point to nodes with
high link reconstruction errors in node2vec embeddings better than node
centrality metrics. The experimental evaluation also shows that the proposed
LID-elastic node2vec extensions improve node2vec by better preserving graph
structure in generated embeddings.
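The abstract contrasts NC-LID with existing distance-based LID estimators designed for vector data. For context, the standard maximum-likelihood (Hill-type) estimator that such methods build on can be sketched as below. This is a generic illustration of distance-based LID estimation under Euclidean distances, not the NC-LID formula from the paper; the function name and sample setup are illustrative.

```python
import numpy as np

def lid_mle(knn_distances):
    """Maximum-likelihood LID estimate for a query point, given its
    distances to the k nearest neighbors:
        LID = -( (1/k) * sum_i ln(r_i / r_k) )^(-1)
    where r_k is the distance to the k-th (farthest) neighbor.
    """
    r = np.sort(np.asarray(knn_distances, dtype=float))
    return -1.0 / np.mean(np.log(r / r[-1]))

# Sample from a 4-dimensional Gaussian; near the origin the local
# intrinsic dimensionality should be close to 4.
rng = np.random.default_rng(0)
points = rng.standard_normal((5000, 4))
dists = np.linalg.norm(points, axis=1)       # distances from the origin
knn = np.sort(dists)[:100]                    # 100 nearest neighbors
print(lid_mle(knn))                           # estimate near the true dimension 4
```

The estimator needs only a sorted list of neighbor distances, which is exactly why (as the abstract notes) it does not transfer directly to graphs: shortest-path distances are small integers with many ties, so the ratios r_i / r_k carry little information, motivating a community-based measure like NC-LID instead.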
Related papers
- Geodesic Distance Between Graphs: A Spectral Metric for Assessing the Stability of Graph Neural Networks [4.110108749051657]
We introduce a Graph Geodesic Distance (GGD) metric for assessing the generalization and stability of Graph Neural Networks (GNNs).
We show that the proposed GGD metric can effectively quantify dissimilarities between two graphs by encapsulating their differences in key structural (spectral) properties.
The proposed GGD metric shows significantly improved performance for stability evaluation of GNNs especially when only partial node features are available.
arXiv Detail & Related papers (2024-06-15T04:47:40Z)
- Shape-Graph Matching Network (SGM-net): Registration for Statistical Shape Analysis [20.58923754314197]
This paper focuses on the statistical analysis of shapes of data objects called shape graphs.
A critical need here is a constrained registration of points (nodes to nodes, edges to edges) across objects.
This paper tackles this registration problem using a novel neural-network architecture.
arXiv Detail & Related papers (2023-08-14T00:42:03Z)
- BOURNE: Bootstrapped Self-supervised Learning Framework for Unified Graph Anomaly Detection [50.26074811655596]
We propose BOURNE, a novel unified graph anomaly detection framework based on bootstrapped self-supervised learning.
By swapping the context embeddings between nodes and edges, we enable the mutual detection of node and edge anomalies.
BOURNE can eliminate the need for negative sampling, thereby enhancing its efficiency in handling large graphs.
arXiv Detail & Related papers (2023-07-28T00:44:57Z)
- Tree Mover's Distance: Bridging Graph Metrics and Stability of Graph Neural Networks [54.225220638606814]
We propose a pseudometric for attributed graphs, the Tree Mover's Distance (TMD), and study its relation to generalization.
First, we show that TMD captures properties relevant to graph classification; a simple TMD-SVM performs competitively with standard GNNs.
Second, we relate TMD to generalization of GNNs under distribution shifts, and show that it correlates well with performance drop under such shifts.
arXiv Detail & Related papers (2022-10-04T21:03:52Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- On a linear fused Gromov-Wasserstein distance for graph structured data [2.360534864805446]
We propose a novel distance between two graphs, named linearFGW, defined as the Euclidean distance between their embeddings.
The advantages of the proposed distance are twofold: 1) it can take into account both node features and graph structure when measuring the similarity between graphs in a kernel-based framework, 2) computing the kernel matrix is much faster than with pairwise OT-based distances.
arXiv Detail & Related papers (2022-03-09T13:43:18Z)
- SLGCN: Structure Learning Graph Convolutional Networks for Graphs under Heterophily [5.619890178124606]
We propose structure learning graph convolutional networks (SLGCNs) to alleviate the issue from two aspects.
Specifically, we design an efficient spectral clustering with anchors (ESC-ANCH) approach to efficiently aggregate feature representations from all similar nodes.
Experimental results on a wide range of benchmark datasets illustrate that the proposed SLGCNs outperform the state-of-the-art GNN counterparts.
arXiv Detail & Related papers (2021-05-28T13:00:38Z)
- node2coords: Graph Representation Learning with Wasserstein Barycenters [59.07120857271367]
We introduce node2coords, a representation learning algorithm for graphs.
It learns simultaneously a low-dimensional space and coordinates for the nodes in that space.
Experimental results demonstrate that the representations learned with node2coords are interpretable.
arXiv Detail & Related papers (2020-07-31T13:14:25Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.