Barlow Graph Auto-Encoder for Unsupervised Network Embedding
- URL: http://arxiv.org/abs/2110.15742v1
- Date: Fri, 29 Oct 2021 12:30:43 GMT
- Title: Barlow Graph Auto-Encoder for Unsupervised Network Embedding
- Authors: Rayyan Ahmad Khan, Martin Kleinsteuber
- Abstract summary: We propose Barlow Graph Auto-Encoder, a simple yet effective architecture for learning network embedding.
It aims to maximize the similarity between the embedding vectors of immediate and larger neighborhoods of a node, while minimizing the redundancy between the components of these projections.
- Abstract summary: Our approach yields promising results for inductive link prediction and is also on par with the state of the art for clustering and downstream node classification.
- Score: 6.900303913555705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network embedding has emerged as a promising research field for network
analysis. Recently, an approach, named Barlow Twins, has been proposed for
self-supervised learning in computer vision by applying the
redundancy-reduction principle to the embedding vectors corresponding to two
distorted versions of the image samples. Motivated by this, we propose Barlow
Graph Auto-Encoder, a simple yet effective architecture for learning network
embedding. It aims to maximize the similarity between the embedding vectors of
immediate and larger neighborhoods of a node, while minimizing the redundancy
between the components of these projections. We also present its variational
counterpart, the Barlow Variational Graph Auto-Encoder. Our
approach yields promising results for inductive link prediction and is also on
par with the state of the art for clustering and downstream node classification, as
demonstrated by extensive comparisons with several well-known techniques on
three benchmark citation datasets.
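The objective described in the abstract can be sketched as a Barlow-Twins-style loss applied to two embeddings of the same node, e.g. projections of its immediate and larger neighborhoods. The following NumPy sketch is illustrative only, not the authors' implementation; the function name `barlow_loss` and the off-diagonal weight `lam` are assumptions based on the original Barlow Twins formulation.

```python
import numpy as np

def barlow_loss(z_a, z_b, lam=5e-3):
    """Barlow-Twins-style redundancy-reduction loss (illustrative sketch).

    z_a, z_b: (batch, dim) embedding matrices of two views of the same
    nodes, e.g. immediate vs. larger neighborhoods in the BGAE setting.
    """
    # Standardize each embedding dimension across the batch.
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-8)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-8)
    n, d = z_a.shape
    # Cross-correlation matrix between the two views.
    c = z_a.T @ z_b / n
    # Invariance term: push on-diagonal entries toward 1
    # (embeddings of the two views should agree).
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()
    # Redundancy-reduction term: push off-diagonal entries toward 0
    # (embedding components should be decorrelated).
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag
```

Because the loss compares correlation statistics rather than individual pairs, it avoids representation collapse without negative sampling, which is what makes the redundancy-reduction principle attractive for unsupervised network embedding.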
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called Embedding-Induced Graph Refinement Clustering Network (EGRC-Net)
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN)
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Directed Graph Auto-Encoders [3.2873782624127843]
We introduce a new class of auto-encoders for directed graphs motivated by a direct extension of the Weisfeiler-Leman algorithm to pairs of node labels.
We demonstrate the ability of the proposed model to learn meaningful latent embeddings and achieve superior performance on the directed link prediction task.
arXiv Detail & Related papers (2022-02-25T01:19:47Z)
- Self-Supervised Graph Learning with Proximity-based Views and Channel Contrast [4.761137180081091]
Graph neural networks (GNNs) use neighborhood aggregation as a core component that results in feature smoothing among nodes in proximity.
To tackle this problem, we strengthen the graph with two additional graph views, in which nodes are directly linked to those with the most similar features or local structures.
We propose a method that aims to maximize the agreement between representations across generated views and the original graph.
arXiv Detail & Related papers (2021-06-07T15:38:36Z)
- Barlow Twins: Self-Supervised Learning via Redundancy Reduction [31.077182488826963]
Self-supervised learning (SSL) is rapidly closing the gap with supervised methods on large computer vision benchmarks.
We propose an objective function that naturally avoids collapse by measuring the cross-correlation matrix between the outputs of two identical networks.
This causes the representation vectors of distorted versions of a sample to be similar, while minimizing the redundancy between the components of these vectors.
arXiv Detail & Related papers (2021-03-04T18:55:09Z)
- PC-RGNN: Point Cloud Completion and Graph Neural Network for 3D Object Detection [57.49788100647103]
LiDAR-based 3D object detection is an important task for autonomous driving.
Current approaches suffer from sparse and partial point clouds of distant and occluded objects.
In this paper, we propose a novel two-stage approach, namely PC-RGNN, dealing with such challenges by two specific solutions.
arXiv Detail & Related papers (2020-12-18T18:06:43Z)
- Graph Fairing Convolutional Networks for Anomaly Detection [7.070726553564701]
We introduce a graph convolutional network with skip connections for semi-supervised anomaly detection.
The effectiveness of our model is demonstrated through extensive experiments on five benchmark datasets.
arXiv Detail & Related papers (2020-10-20T13:45:47Z)
- GraphCL: Contrastive Self-Supervised Learning of Graph Representations [20.439666392958284]
We propose Graph Contrastive Learning (GraphCL), a general framework for learning node representations in a self-supervised manner.
We use graph neural networks to produce two representations of the same node and leverage a contrastive learning loss to maximize agreement between them.
In both transductive and inductive learning setups, we demonstrate that our approach significantly outperforms the state-of-the-art in unsupervised learning on a number of node classification benchmarks.
arXiv Detail & Related papers (2020-07-15T22:36:53Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.