EGRC-Net: Embedding-induced Graph Refinement Clustering Network
- URL: http://arxiv.org/abs/2211.10627v2
- Date: Tue, 14 Nov 2023 06:00:04 GMT
- Title: EGRC-Net: Embedding-induced Graph Refinement Clustering Network
- Authors: Zhihao Peng, Hui Liu, Yuheng Jia, Junhui Hou
- Abstract summary: We propose a novel graph clustering network called the Embedding-Induced Graph Refinement Clustering Network (EGRC-Net).
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
- Score: 66.44293190793294
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing graph clustering networks heavily rely on a predefined yet fixed
graph, which can lead to failures when the initial graph fails to accurately
capture the data topology structure of the embedding space. In order to address
this issue, we propose a novel clustering network called Embedding-Induced
Graph Refinement Clustering Network (EGRC-Net), which effectively utilizes the
learned embedding to adaptively refine the initial graph and enhance the
clustering performance. To begin, we leverage both semantic and topological
information by employing a vanilla auto-encoder and a graph convolution
network, respectively, to learn a latent feature representation. Subsequently,
we utilize the local geometric structure within the feature embedding space to
construct an adjacency matrix for the graph. This adjacency matrix is
dynamically fused with the initial one using our proposed fusion architecture.
To train the network in an unsupervised manner, we minimize the Jeffreys
divergence between multiple derived distributions. Additionally, we introduce
an improved approximate personalized propagation of neural predictions to
replace the standard graph convolution network, enabling EGRC-Net to scale
effectively. Through extensive experiments conducted on nine widely-used
benchmark datasets, we demonstrate that our proposed methods consistently
outperform several state-of-the-art approaches. Notably, EGRC-Net achieves an
improvement of more than 11.99% in Adjusted Rand Index (ARI) over the best
baseline on the DBLP dataset. Furthermore, our scalable approach exhibits a
10.73% gain in ARI while reducing memory usage by 33.73% and decreasing running
time by 19.71%. The code for EGRC-Net will be made publicly available at
https://github.com/ZhihaoPENG-CityU/EGRC-Net.
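The abstract only sketches the pipeline, so the following is a minimal NumPy illustration of two of its ingredients: constructing an adjacency matrix from the local geometry of the learned embedding and fusing it with the initial graph, together with the Jeffreys (symmetric KL) divergence used as the unsupervised objective. The cosine-similarity k-NN construction, the fixed fusion weight alpha, and all function names are illustrative assumptions, not the paper's actual fusion architecture or target distributions.

import numpy as np

def knn_adjacency(Z, k=5):
    # Build a symmetric k-NN adjacency from the learned embedding Z (n x d),
    # using cosine similarity as a proxy for local geometric structure.
    Zn = Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-12)
    sim = Zn @ Zn.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-matches from the neighbour search
    n = Z.shape[0]
    A = np.zeros((n, n))
    idx = np.argsort(-sim, axis=1)[:, :k]   # k most similar nodes per row
    A[np.repeat(np.arange(n), k), idx.ravel()] = 1.0
    return np.maximum(A, A.T)               # symmetrise

def fuse_graphs(A_init, A_emb, alpha=0.5):
    # Convex combination of the initial graph and the embedding-induced graph.
    # EGRC-Net learns this fusion; a fixed weight alpha stands in for it here.
    return alpha * A_init + (1.0 - alpha) * A_emb

def jeffreys_divergence(P, Q, eps=1e-12):
    # Jeffreys divergence = KL(P || Q) + KL(Q || P), i.e. the symmetric form
    # of the KL divergence minimized between the derived distributions.
    P = np.clip(P, eps, 1.0)
    Q = np.clip(Q, eps, 1.0)
    return np.sum(P * np.log(P / Q)) + np.sum(Q * np.log(Q / P))

# Toy usage: refine a random initial graph with a k-NN graph built from embeddings.
rng = np.random.default_rng(0)
Z = rng.standard_normal((8, 4))             # stand-in for the learned embedding
A_init = (rng.random((8, 8)) > 0.7).astype(float)
A_init = np.maximum(A_init, A_init.T)
A_refined = fuse_graphs(A_init, knn_adjacency(Z, k=3))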
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Graph Transformer GANs with Graph Masked Modeling for Architectural
Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN).
arXiv Detail & Related papers (2023-10-23T01:25:44Z) - GCondNet: A Novel Method for Improving Neural Networks on Small High-Dimensional Tabular Data [14.124731264553889]
We propose GCondNet to enhance neural networks by leveraging implicit structures present in data.
GCondNet exploits the data's high-dimensionality, and thus improves the performance of an underlying predictor network.
We demonstrate GCondNet's effectiveness on 12 real-world datasets, where it outperforms 14 standard and state-of-the-art methods.
arXiv Detail & Related papers (2022-11-11T16:13:34Z) - SCGC : Self-Supervised Contrastive Graph Clustering [1.1470070927586016]
Graph clustering discovers groups or communities within networks.
Deep learning methods such as autoencoders cannot incorporate rich structural information.
We propose Self-Supervised Contrastive Graph Clustering (SCGC).
arXiv Detail & Related papers (2022-04-27T01:38:46Z) - SLGCN: Structure Learning Graph Convolutional Networks for Graphs under
Heterophily [5.619890178124606]
We propose structure learning graph convolutional networks (SLGCNs) to alleviate this issue from two aspects.
Specifically, we design an efficient spectral clustering with anchors (ESC-ANCH) approach to efficiently aggregate feature representations from all similar nodes.
Experimental results on a wide range of benchmark datasets illustrate that the proposed SLGCNs outperform the state-of-the-art GNN counterparts.
arXiv Detail & Related papers (2021-05-28T13:00:38Z) - Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of using a fixed network path, DG-Net aggregates features dynamically at each node, which gives the network greater representational capacity.
arXiv Detail & Related papers (2020-10-02T16:50:26Z) - Representation Learning of Graphs Using Graph Convolutional Multilayer
Networks Based on Motifs [17.823543937167848]
mGCMN is a novel framework that utilizes node feature information and the higher-order local structure of the graph.
It greatly improves the learning efficiency of the graph neural network and promotes the establishment of a new learning paradigm.
arXiv Detail & Related papers (2020-07-31T04:18:20Z) - Self-Constructing Graph Convolutional Networks for Semantic Labeling [23.623276007011373]
We propose a novel architecture called the Self-Constructing Graph (SCG), which makes use of learnable latent variables to generate embeddings.
SCG can automatically obtain optimized non-local context graphs from complex-shaped objects in aerial imagery.
We demonstrate the effectiveness and flexibility of the proposed SCG on the publicly available ISPRS Vaihingen dataset.
arXiv Detail & Related papers (2020-03-15T21:55:24Z)