Adaptive Graph Encoder for Attributed Graph Embedding
- URL: http://arxiv.org/abs/2007.01594v1
- Date: Fri, 3 Jul 2020 10:20:34 GMT
- Title: Adaptive Graph Encoder for Attributed Graph Embedding
- Authors: Ganqu Cui, Jie Zhou, Cheng Yang, Zhiyuan Liu
- Abstract summary: Attributed graph embedding learns vector representations from graph topology and node features.
We propose Adaptive Graph Encoder (AGE), a novel attributed graph embedding framework.
We conduct experiments using four public benchmark datasets to validate AGE on node clustering and link prediction tasks.
- Score: 36.06427854846497
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Attributed graph embedding, which learns vector representations from graph
topology and node features, is a challenging task for graph analysis. Recently,
methods based on graph convolutional networks (GCNs) have made great progress
on this task. However, existing GCN-based methods have three major drawbacks.
Firstly, our experiments indicate that the entanglement of graph convolutional
filters and weight matrices harms both performance and robustness. Secondly, we
show that the graph convolutional filters in these methods are special cases of
generalized Laplacian smoothing filters, but they do not preserve optimal
low-pass characteristics. Finally, the training objectives of existing
algorithms usually amount to reconstructing the adjacency matrix or the feature
matrix, which is not always consistent with real-world applications. To
address these issues, we propose Adaptive Graph Encoder (AGE), a novel
attributed graph embedding framework. AGE consists of two modules: (1) To
better alleviate the high-frequency noise in the node features, AGE first
applies a carefully-designed Laplacian smoothing filter. (2) AGE employs an
adaptive encoder that iteratively strengthens the filtered features for better
node embeddings. We conduct experiments using four public benchmark datasets to
validate AGE on node clustering and link prediction tasks. Experimental results
show that AGE consistently outperforms state-of-the-art graph embedding methods
considerably on these tasks.
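For illustration, here is a minimal sketch of the two modules described above, written in Python with NumPy/SciPy and PyTorch. The filter strength k = 2/3, the number of filter layers t = 8, the 500-dimensional linear encoder, and the fraction-based similarity thresholds are illustrative assumptions rather than the paper's tuned hyperparameters.

    # Sketch of the AGE pipeline: (1) a Laplacian smoothing filter applied to the
    # node features, (2) an adaptive encoder trained with similarity-ranked
    # pseudo-positive/negative node pairs. Hyperparameters are illustrative.
    import numpy as np
    import scipy.sparse as sp
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def laplacian_smoothing(adj, features, k=2/3, t=8):
        """Apply t layers of the low-pass filter H = I - k * L_sym."""
        adj = adj + sp.eye(adj.shape[0])                    # add self-loops
        deg = np.asarray(adj.sum(1)).flatten()
        d_inv_sqrt = sp.diags(np.power(deg, -0.5))
        lap = sp.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
        h = sp.eye(adj.shape[0]) - k * lap                  # smoothing filter
        x = features
        for _ in range(t):
            x = h @ x                                       # repeated smoothing
        return torch.FloatTensor(x)

    class AdaptiveEncoder(nn.Module):
        """A single linear layer mapping filtered features to unit-norm embeddings."""
        def __init__(self, in_dim, out_dim=500):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, x):
            return F.normalize(self.linear(x), dim=1)

    def training_step(encoder, optimizer, x, pos_frac=0.01, neg_frac=0.5):
        """Rank node pairs by embedding similarity; treat the most similar as
        pseudo-positives and the least similar as pseudo-negatives."""
        z = encoder(x)
        sim = (z @ z.t()).flatten()                         # cosine similarities
        ranked = sim.argsort(descending=True)               # self-pairs kept for brevity
        pos = ranked[: int(pos_frac * sim.numel())]
        neg = ranked[-int(neg_frac * sim.numel()):]
        logits = torch.cat([sim[pos], sim[neg]])
        labels = torch.cat([torch.ones(len(pos)), torch.zeros(len(neg))])
        loss = F.binary_cross_entropy_with_logits(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

In use, one would filter the raw features once and then call training_step for a number of epochs, gradually tightening pos_frac and neg_frac in the spirit of the paper's adaptive thresholding.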
Related papers
- Amplify Graph Learning for Recommendation via Sparsity Completion [16.32861024767423]
Graph learning models have been widely deployed in collaborative filtering (CF) based recommendation systems.
Due to the issue of data sparsity, the graph structure of the original input lacks potential positive preference edges.
We propose an Amplify Graph Learning framework based on Sparsity Completion (called AGL-SC)
arXiv Detail & Related papers (2024-06-27T08:26:20Z)
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs)
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
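As a rough illustration of selecting nodes by their spectral embeddings, the sketch below computes low-frequency Laplacian eigenvectors and then picks a coreset greedily with a farthest-point rule; the spectral dimension, the selection rule, and the function names are assumptions for this sketch, not SGGC's exact objective.

    # Hedged sketch: greedy selection over spectral node embeddings (the
    # farthest-point rule is an illustrative stand-in for SGGC's criterion).
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigsh

    def spectral_embedding(adj, dim=16):
        adj = adj + sp.eye(adj.shape[0])                 # self-loops avoid zero degrees
        deg = np.asarray(adj.sum(1)).flatten()
        d_inv_sqrt = sp.diags(np.power(deg, -0.5))
        lap = sp.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
        _, vecs = eigsh(lap, k=dim, which="SM")          # low-frequency eigenvectors
        return vecs

    def greedy_coreset(emb, budget):
        chosen = [0]                                     # seed with an arbitrary node
        dist = np.linalg.norm(emb - emb[0], axis=1)
        while len(chosen) < budget:
            nxt = int(dist.argmax())                     # farthest node from the coreset
            chosen.append(nxt)
            dist = np.minimum(dist, np.linalg.norm(emb - emb[nxt], axis=1))
        return chosen                                    # centers of the selected ego-graphs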
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- ADA-GAD: Anomaly-Denoised Autoencoders for Graph Anomaly Detection [84.0718034981805]
We introduce a novel framework called Anomaly-Denoised Autoencoders for Graph Anomaly Detection (ADA-GAD)
In the first stage, we design a learning-free anomaly-denoised augmentation method to generate graphs with reduced anomaly levels.
In the next stage, the decoders are retrained for detection on the original graph.
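A schematic reading of this two-stage recipe is sketched below: pretrain an autoencoder on an anomaly-reduced augmentation, then freeze the encoder and retrain the decoder on the original data, scoring anomalies by reconstruction error. The augmentation heuristic and the feature-only autoencoder are simplifying assumptions; ADA-GAD itself operates on graph structure and attributes.

    # Hedged two-stage sketch: (1) train on an anomaly-reduced augmentation,
    # (2) retrain only the decoder on the original data. Feature-only for brevity.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FeatureAutoencoder(nn.Module):
        def __init__(self, in_dim, hidden=64):
            super().__init__()
            self.encoder = nn.Linear(in_dim, hidden)
            self.decoder = nn.Linear(hidden, in_dim)

        def forward(self, x):
            return self.decoder(torch.relu(self.encoder(x)))

    def denoise_augment(x, keep_ratio=0.9):
        """Learning-free augmentation (illustrative): drop the nodes whose
        features deviate most from the global mean."""
        dev = (x - x.mean(0)).norm(dim=1)
        return x[dev.argsort()[: int(keep_ratio * x.size(0))]]

    def train(model, x, epochs=100, lr=1e-3, params=None):
        opt = torch.optim.Adam(params if params is not None else model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            F.mse_loss(model(x), x).backward()
            opt.step()

    def anomaly_scores(x):
        model = FeatureAutoencoder(x.size(1))
        train(model, denoise_augment(x))                    # stage 1: reduced anomalies
        for p in model.encoder.parameters():
            p.requires_grad = False                         # keep the denoised encoder
        train(model, x, params=model.decoder.parameters())  # stage 2: decoder only
        with torch.no_grad():
            return (model(x) - x).norm(dim=1)               # higher = more anomalous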
arXiv Detail & Related papers (2023-12-22T09:02:01Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of finetuned LM.
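The two-step recipe (parameter-efficient fine-tuning of an LM, then exporting last-hidden-state embeddings) can be sketched roughly as follows; the model name, LoRA configuration, and mean pooling are assumptions for illustration, and the downstream fine-tuning loop is omitted.

    # Hedged SimTeG-style sketch: build a LoRA-wrapped LM (fine-tuning loop
    # omitted), then export node embeddings from its last hidden states.
    import torch
    from transformers import AutoModel, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    def build_peft_lm(name="distilbert-base-uncased"):
        tokenizer = AutoTokenizer.from_pretrained(name)
        model = AutoModel.from_pretrained(name)
        lora = LoraConfig(r=8, lora_alpha=16, target_modules=["q_lin", "v_lin"])
        return tokenizer, get_peft_model(model, lora)    # only LoRA weights train

    @torch.no_grad()
    def node_embeddings(tokenizer, model, node_texts, batch_size=32):
        """Mean-pool the last hidden states of the (fine-tuned) LM per node text."""
        model.eval()
        embs = []
        for i in range(0, len(node_texts), batch_size):
            batch = tokenizer(node_texts[i:i + batch_size], padding=True,
                              truncation=True, return_tensors="pt")
            out = model(**batch).last_hidden_state       # [batch, tokens, hidden]
            mask = batch["attention_mask"].unsqueeze(-1) # ignore padding tokens
            embs.append((out * mask).sum(1) / mask.sum(1))
        return torch.cat(embs)                           # features for a downstream GNN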
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, namely Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN is the first attempt to learn to adapt to the optimal graph kernel in a unified manner.
Experiments are conducted on acknowledged benchmark datasets and promising results demonstrate the outstanding performance of our proposed AKGNN.
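One way to picture an adaptive graph kernel is a per-layer learnable coefficient that balances an all-pass (identity) term against a low-pass (normalized adjacency) term, as in the sketch below; this is a schematic reading rather than AKGNN's exact parameterization.

    # Schematic adaptive-kernel layer: a learnable lambda trades off an all-pass
    # (identity) component against a low-pass (normalized adjacency) component.
    import torch
    import torch.nn as nn

    class AdaptiveKernelLayer(nn.Module):
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.lin = nn.Linear(in_dim, out_dim)
            self.raw_lambda = nn.Parameter(torch.zeros(1))   # learnable kernel knob

        def forward(self, x, adj_norm):
            lam = 1.0 + torch.relu(self.raw_lambda)          # constrain lambda >= 1
            # lam = 1 gives a pure low-pass kernel; larger lam lets more of the
            # identity (high-frequency) component through
            h = (2 * (lam - 1) / lam) * x + (2 / lam) * (adj_norm @ x)
            return torch.relu(self.lin(h))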
arXiv Detail & Related papers (2021-12-08T20:23:58Z)
- Graph Denoising with Framelet Regularizer [25.542429117462547]
This paper tailors regularizers for graph data in terms of both feature and structure noises.
Our model achieves significantly better performance compared with popular graph convolutions even when the graph is heavily contaminated.
arXiv Detail & Related papers (2021-11-05T05:17:23Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Pseudoinverse Graph Convolutional Networks: Fast Filters Tailored for Large Eigengaps of Dense Graphs and Hypergraphs [0.0]
Graph Convolutional Networks (GCNs) have proven to be successful tools for semi-supervised classification on graph-based datasets.
We propose a new GCN variant whose three-part filter space is targeted at dense graphs.
arXiv Detail & Related papers (2020-08-03T08:48:41Z)
- Heuristic Semi-Supervised Learning for Graph Generation Inspired by Electoral College [80.67842220664231]
We propose a novel pre-processing technique, namely ELectoral COllege (ELCO), which automatically expands new nodes and edges to refine the label similarity within a dense subgraph.
In all setups tested, our method boosts the average score of base models by a large margin of 4.7 points, as well as consistently outperforms the state-of-the-art.
arXiv Detail & Related papers (2020-06-10T14:48:48Z)
- Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063]
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods, unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE) are proposed.
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
arXiv Detail & Related papers (2020-03-10T02:33:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.