A Brief Survey on Representation Learning based Graph Dimensionality
Reduction Techniques
- URL: http://arxiv.org/abs/2211.05594v1
- Date: Thu, 13 Oct 2022 04:29:24 GMT
- Title: A Brief Survey on Representation Learning based Graph Dimensionality
Reduction Techniques
- Authors: Akhil Pandey Akella
- Abstract summary: Dimensionality reduction techniques map data represented in higher-dimensional spaces onto lower-dimensional spaces with varying degrees of information loss.
Several techniques efficiently generate embeddings from graph data and project them onto low-dimensional latent spaces.
We present this survey to outline the benefits of, as well as the problems associated with, existing graph dimensionality reduction techniques.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dimensionality reduction techniques map data represented in
higher-dimensional spaces onto lower-dimensional spaces with varying degrees
of information loss. Graph dimensionality reduction techniques adopt the same
principle, providing latent representations of the graph structure with minor
adaptations to the output representations along with the input data. Several
cutting-edge techniques efficiently generate embeddings from graph data and
project them onto low-dimensional latent spaces. Owing to variations in their
operational philosophy, the benefits of a particular graph dimensionality
reduction technique may not carry over to every scenario, or indeed every
dataset. As a result, some techniques are efficient at representing the
relationships between nodes in lower dimensions, while others are good at
encapsulating the entire graph structure in a low-dimensional space. We
present this survey to outline the benefits of, as well as the problems
associated with, existing graph dimensionality reduction techniques. We also
attempt to connect the dots regarding potential improvements to some of these
techniques. This survey should be helpful for upcoming researchers interested
in exploring graph representation learning to effectively produce
low-dimensional graph embeddings with varying degrees of granularity.
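As a concrete illustration of the principle the abstract describes, here is a minimal sketch of one classic graph dimensionality reduction technique, spectral embedding via the normalized graph Laplacian. The function name and the toy two-cluster graph are our own illustrative choices, not taken from the surveyed paper:

```python
import numpy as np

def spectral_embedding(adj: np.ndarray, dim: int = 2) -> np.ndarray:
    """Embed each node of an undirected graph into `dim` dimensions
    using eigenvectors of the symmetric normalized graph Laplacian."""
    deg = adj.sum(axis=1)
    # L = I - D^{-1/2} A D^{-1/2}; guard against isolated nodes.
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt
    # eigh returns eigenvalues in ascending order; the eigenvectors of
    # the smallest non-trivial eigenvalues capture coarse cluster structure.
    eigvals, eigvecs = np.linalg.eigh(lap)
    return eigvecs[:, 1:dim + 1]

# A toy graph: two triangles joined by a single bridge edge.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

emb = spectral_embedding(A, dim=2)
print(emb.shape)  # (6, 2): one 2-D coordinate per node
```

In this sketch, the first embedding coordinate (the Fiedler direction) separates the two triangles, which is exactly the sense in which a low-dimensional latent space can preserve the relationship between nodes.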
Related papers
- A Comprehensive Survey on Graph Reduction: Sparsification, Coarsening, and Condensation [21.76051896779245]
We aim to provide a comprehensive understanding of graph reduction methods, including graph sparsification, graph coarsening, and graph condensation.
Our survey then systematically reviews the technical details of these methods and emphasizes their practical applications across diverse scenarios.
arXiv Detail & Related papers (2024-01-29T01:19:09Z)
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- Exploring Graph Classification Techniques Under Low Data Constraints: A Comprehensive Study [0.0]
It covers various techniques for graph data augmentation, including node and edge perturbation, graph coarsening, and graph generation.
The paper explores these areas in depth and delves into further sub-classifications.
It provides an extensive array of techniques that can be employed in solving graph processing problems faced in low-data scenarios.
arXiv Detail & Related papers (2023-11-21T17:23:05Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Unbiased Graph Embedding with Biased Graph Observations [52.82841737832561]
We propose a principled new way for obtaining unbiased representations by learning from an underlying bias-free graph.
Based on this new perspective, we propose two complementary methods for uncovering such an underlying graph.
arXiv Detail & Related papers (2021-10-26T18:44:37Z)
- Hyperparameter-free and Explainable Whole Graph Embedding [16.03671347701557]
Graph representation learning attempts to learn a lower-dimensional representation vector for each node or the whole graph.
This paper proposes a new whole graph embedding method, combining the DHC (Degree, H-index, and Coreness) theorem and Shannon Entropy (E).
The proposed approach has a good performance in lower-dimensional graph visualization.
arXiv Detail & Related papers (2021-08-04T15:30:52Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method yields a representative graph with clear cluster structure, which can then be used to solve clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Understanding Coarsening for Embedding Large-Scale Graphs [3.6739949215165164]
Proper analysis of graphs with Machine Learning (ML) algorithms has the potential to yield far-reaching insights into many areas of research and industry.
The irregular structure of graph data constitutes an obstacle for running ML tasks on graphs.
We analyze the impact of the coarsening quality on the embedding performance both in terms of speed and accuracy.
arXiv Detail & Related papers (2020-09-10T15:06:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.