AmGCL: Feature Imputation of Attribute Missing Graph via Self-supervised
Contrastive Learning
- URL: http://arxiv.org/abs/2305.03741v1
- Date: Fri, 5 May 2023 07:03:24 GMT
- Title: AmGCL: Feature Imputation of Attribute Missing Graph via Self-supervised
Contrastive Learning
- Authors: Xiaochuan Zhang, Mengran Li, Ye Wang, Haojun Fei
- Abstract summary: Attribute missing Graph Contrastive Learning (AmGCL) is a framework for handling missing node attributes in attribute graph data.
Our experimental results on multiple real-world datasets demonstrate that AmGCL outperforms state-of-the-art methods in both feature imputation and node classification tasks.
- Score: 2.42435538337438
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Attribute graphs are ubiquitous in multimedia applications, and graph
representation learning (GRL) has been successful in analyzing attribute graph
data. However, incomplete graph data and missing node attributes can have a
negative impact on media knowledge discovery. Existing methods for handling
attribute missing graph have limited assumptions or fail to capture complex
attribute-graph dependencies. To address these challenges, we propose Attribute
missing Graph Contrastive Learning (AmGCL), a framework for handling missing
node attributes in attribute graph data. AmGCL leverages Dirichlet energy
minimization-based feature precoding to impute missing attributes and a
self-supervised Graph Augmentation Contrastive Learning Structure (GACLS) to
learn latent variables from the imputed data. Specifically, AmGCL performs
feature reconstruction based on structure-attribute energy minimization while
maximizing the evidence lower bound on latent representation mutual
information. Our experimental results on multiple real-world datasets
demonstrate that AmGCL outperforms state-of-the-art methods in both feature
imputation and node classification tasks, indicating the effectiveness of our
proposed method in real-world attribute graph analysis tasks.
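The Dirichlet energy minimization-based precoding described above can be approximated with standard feature propagation: each missing node's features are repeatedly replaced by the average of its neighbors' features while observed features stay clamped. The following is a minimal NumPy sketch under that interpretation; the dense adjacency representation, function name, and iteration count are illustrative assumptions, not the paper's exact precoder.

```python
import numpy as np

def propagate_features(adj, x, known_mask, n_iters=40):
    """Impute missing node features by iterative neighbor averaging.

    Minimizing the graph Dirichlet energy subject to the observed features
    being fixed amounts to repeatedly replacing each missing node's features
    with the mean of its neighbors' features.

    adj        : (n, n) symmetric adjacency matrix (0/1 entries)
    x          : (n, d) feature matrix; rows for missing nodes may be zero
    known_mask : (n,) boolean, True where the node's attributes are observed
    """
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # avoid division by zero
    x = x.copy()
    x_known = x[known_mask].copy()
    for _ in range(n_iters):
        x = (adj @ x) / deg                  # one step of neighbor averaging
        x[known_mask] = x_known              # clamp observed features
    return x
```

On a path graph 0-1-2 where only node 1 is missing, the imputed value converges to the mean of its two observed neighbors.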
Related papers
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency
between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Redundancy-Free Self-Supervised Relational Learning for Graph Clustering [13.176413653235311]
We propose a novel self-supervised deep graph clustering method named Redundancy-Free Graph Clustering (R$^2$FGC).
It extracts the attribute- and structure-level relational information from both global and local views based on an autoencoder and a graph autoencoder.
Our experiments are performed on widely used benchmark datasets to validate the superiority of our R$^2$FGC over state-of-the-art baselines.
arXiv Detail & Related papers (2023-09-09T06:18:50Z)
- Feature propagation as self-supervision signals on graphs [0.0]
Regularized Graph Infomax (RGI) is a simple yet effective framework for node level self-supervised learning.
We show that RGI can achieve state-of-the-art performance regardless of its simplicity.
arXiv Detail & Related papers (2023-03-15T14:20:06Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL)
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
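Node-level graph contrastive objectives of this kind are typically instances of InfoNCE: matching nodes across two augmented views act as positives, and all other nodes serve as negatives. Below is a generic NumPy sketch of that objective, an assumption for illustration rather than Local-GCL's exact formulation (which localizes positive sampling).

```python
import numpy as np

def infonce_loss(z1, z2, tau=0.5):
    """Generic InfoNCE objective over two views of the same nodes.

    z1, z2 : (n, d) embeddings of the same n nodes under two augmentations.
    Matching rows are positive pairs; all other rows act as negatives.
    """
    # cosine similarity matrix between the two views, scaled by temperature
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau
    # numerically stable log-softmax over each row;
    # the diagonal holds the positive pairs
    sim = sim - sim.max(axis=1, keepdims=True)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

When the two views agree on every node, the diagonal dominates each row and the loss is small; shuffling one view raises it.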
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- Features Based Adaptive Augmentation for Graph Contrastive Learning [0.0]
Self-supervised learning aims to eliminate the need for expensive annotation in graph representation learning.
We introduce a Feature Based Adaptive Augmentation (FebAA) approach, which identifies and preserves potentially influential features.
We successfully improved the accuracy of GRACE and BGRL on eight graph representation learning benchmark datasets.
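Feature-based augmentations of this family usually start from random feature masking; FebAA's adaptive idea is to protect influential features from being dropped. The sketch below approximates that by a user-supplied column list rather than a learned importance score, so the `keep_cols` mechanism is an illustrative assumption, not FebAA's actual method.

```python
import numpy as np

def mask_features(x, drop_prob=0.2, keep_cols=None, rng=None):
    """Randomly zero out feature columns to create an augmented view.

    x         : (n, d) node feature matrix
    drop_prob : probability of masking each feature dimension
    keep_cols : optional indices of "influential" columns that are never
                masked (standing in here for a learned importance score)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    d = x.shape[1]
    mask = rng.random(d) >= drop_prob        # True = keep this column
    if keep_cols is not None:
        mask[np.asarray(keep_cols)] = True   # always preserve these columns
    return x * mask                          # broadcast mask over all rows
```

Two calls with different seeds yield the two correlated views that a contrastive objective then compares.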
arXiv Detail & Related papers (2022-07-05T03:41:20Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally
Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this new approach, we adopt distinct but most suitable attribute and topology fusion mechanisms in the two views, which are conducive to mining relevant information in attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Wasserstein diffusion on graphs with missing attributes [38.153052525001264]
We propose an innovative node representation learning framework, Wasserstein graph diffusion (WGD), to mitigate the problem.
Instead of feature imputation, our method directly learns node representations from the missing-attribute graphs.
arXiv Detail & Related papers (2021-02-06T00:06:51Z)
- Learning on Attribute-Missing Graphs [66.76561524848304]
In an attribute-missing graph, attributes are available for only some nodes, while those of the others are entirely missing.
Existing graph learning methods, including popular GNNs, cannot provide satisfactory learning performance in this setting.
We develop a novel distribution matching based GNN called structure-attribute transformer (SAT) for attribute-missing graphs.
arXiv Detail & Related papers (2020-11-03T11:09:52Z)
- Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063]
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods, unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE), are proposed.
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
arXiv Detail & Related papers (2020-03-10T02:33:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.