Graph InfoClust: Leveraging cluster-level node information for
unsupervised graph representation learning
- URL: http://arxiv.org/abs/2009.06946v1
- Date: Tue, 15 Sep 2020 09:33:20 GMT
- Title: Graph InfoClust: Leveraging cluster-level node information for
unsupervised graph representation learning
- Authors: Costas Mavromatis, George Karypis
- Abstract summary: We propose a graph representation learning method called Graph InfoClust.
It seeks to additionally capture cluster-level information content.
This optimization leads the node representations to capture richer information and nodal interactions, which improves their quality.
- Score: 12.592903558338444
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised (or self-supervised) graph representation learning is essential
to facilitate various graph data mining tasks when external supervision is
unavailable. The challenge is to encode the information about the graph
structure and the attributes associated with the nodes and edges into a low
dimensional space. Most existing unsupervised methods promote similar
representations across nodes that are topologically close. Recently, it was
shown that leveraging additional graph-level information, e.g., information
that is shared among all nodes, encourages the representations to be mindful of
the global properties of the graph, which greatly improves their quality.
However, in most graphs, there is significantly more structure that can be
captured, e.g., nodes tend to belong to (multiple) clusters that represent
structurally similar nodes. Motivated by this observation, we propose a graph
representation learning method called Graph InfoClust (GIC), that seeks to
additionally capture cluster-level information content. These clusters are
computed by a differentiable K-means method and are jointly optimized by
maximizing the mutual information between nodes of the same clusters. This
optimization leads the node representations to capture richer information and
nodal interactions, which improves their quality. Experiments show that GIC
outperforms state-of-the-art methods in various downstream tasks (node
classification, link prediction, and node clustering) with a 0.9% to 6.1% gain
over the best competing approach, on average.
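To make the cluster-level objective described in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of its two ingredients: a differentiable (soft) K-means assignment of node embeddings to learnable centroids, and an InfoNCE-style surrogate that maximizes agreement between each node and its cluster summary. All names (soft_kmeans_assign, cluster_infomax_loss, the temperature tau) and design details are illustrative assumptions, not the authors' implementation.

```python
# Minimal illustrative sketch (not the authors' code): soft K-means assignments
# plus a cluster-level InfoNCE-style term over node embeddings Z of shape [N, d].
import torch
import torch.nn.functional as F

def soft_kmeans_assign(Z, centroids, tau=0.5):
    """Differentiable cluster assignments: softmax over scaled similarities
    between node embeddings Z [N, d] and learnable centroids [K, d]."""
    sim = Z @ centroids.t()               # [N, K] dot-product similarities
    return F.softmax(sim / tau, dim=1)    # soft assignment matrix R [N, K]

def cluster_infomax_loss(Z, R):
    """InfoNCE-style surrogate for cluster-level mutual information: each node
    should score higher with its own (soft) cluster summary than with the
    summaries of the other clusters."""
    Z = F.normalize(Z, dim=1)
    summaries = F.normalize(R.t() @ Z, dim=1)   # [K, d] soft cluster summaries
    logits = Z @ summaries.t()                  # [N, K] node-vs-summary scores
    targets = R.argmax(dim=1)                   # most likely cluster per node
    return F.cross_entropy(logits, targets)

# Toy usage: random "embeddings" and learnable centroids; in practice Z would
# come from a GNN encoder and both would be optimized jointly.
N, d, K = 100, 16, 7
Z = torch.randn(N, d, requires_grad=True)
centroids = torch.randn(K, d, requires_grad=True)
R = soft_kmeans_assign(Z, centroids)
loss = cluster_infomax_loss(Z, R)
loss.backward()
```

As the abstract notes, GIC optimizes the clusters jointly with the node representations; the sketch only illustrates the general shape of such a cluster-level term.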
Related papers
- Attributed Graph Clustering via Generalized Quaternion Representation Learning [38.98084537010657]
We propose a graph auto-encoder network, which introduces quaternion operations to the encoders to achieve efficient structured feature representation learning.
The node representations learned by the proposed graph clustering method turn out to be more discriminative, as they contain global distribution information, and more general, suiting downstream clustering under different values of $k$.
arXiv Detail & Related papers (2024-11-22T04:46:31Z)
- Cluster-based Graph Collaborative Filtering [55.929052969825825]
Graph Convolution Networks (GCNs) have succeeded in learning user and item representations for recommendation systems.
Most existing GCN-based methods overlook the multiple interests of users while performing high-order graph convolution.
We propose a novel GCN-based recommendation model, termed Cluster-based Graph Collaborative Filtering (ClusterGCF).
arXiv Detail & Related papers (2024-04-16T07:05:16Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Redundancy-Free Self-Supervised Relational Learning for Graph Clustering [13.176413653235311]
We propose a novel self-supervised deep graph clustering method named Redundancy-Free Graph Clustering (R$^2$FGC).
It extracts the attribute- and structure-level relational information from both global and local views based on an autoencoder and a graph autoencoder.
Our experiments are performed on widely used benchmark datasets to validate the superiority of our R$^2$FGC over state-of-the-art baselines.
arXiv Detail & Related papers (2023-09-09T06:18:50Z)
- Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model by contrasting cluster assignments, called GRCCA.
It is motivated to make good use of local and global information by combining clustering algorithms and contrastive learning.
GRCCA is strongly competitive in most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between the graphs of different views via the minimization of the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from an implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z)
- CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning [19.432449825536423]
Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision.
We present a novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques.
arXiv Detail & Related papers (2020-09-03T13:57:18Z)
- Unsupervised Hierarchical Graph Representation Learning by Mutual Information Maximization [8.14036521415919]
We present an unsupervised graph representation learning method, Unsupervised Hierarchical Graph Representation (UHGR).
Our method focuses on maximizing mutual information between "local" and high-level "global" representations; a generic sketch of this kind of local-global objective is given after this list.
The results show that the proposed method achieves comparable results to state-of-the-art supervised methods on several benchmarks.
arXiv Detail & Related papers (2020-03-18T18:21:48Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
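Several of the works above (UHGR, and the graph-level objective that GIC itself builds on) maximize mutual information between node-level "local" representations and a graph-level "global" summary. Below is a generic, Deep-Graph-Infomax-style sketch of that objective; the bilinear discriminator and the shuffling-based corruption are common choices in this family and are assumptions here, not details taken from any specific paper above.

```python
# Generic sketch of a local-global mutual-information objective
# (Deep-Graph-Infomax-style), not any specific paper's implementation.
import torch
import torch.nn.functional as F

def local_global_mi_loss(Z, W):
    """Z: node embeddings [N, d]; W: bilinear discriminator weights [d, d].
    The graph summary is the mean node embedding; corrupted nodes are a
    row-shuffled copy of Z (embeddings paired with the wrong context)."""
    s = torch.sigmoid(Z.mean(dim=0))          # global summary vector [d]
    Z_corrupt = Z[torch.randperm(Z.size(0))]  # negative (corrupted) samples
    pos = (Z @ W) @ s                         # [N] scores for real pairs
    neg = (Z_corrupt @ W) @ s                 # [N] scores for corrupted pairs
    labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
    return F.binary_cross_entropy_with_logits(torch.cat([pos, neg]), labels)

# Toy usage with random embeddings and a learnable discriminator.
N, d = 50, 16
Z = torch.randn(N, d, requires_grad=True)
W = torch.randn(d, d, requires_grad=True)
loss = local_global_mi_loss(Z, W)
loss.backward()
```

The binary cross-entropy over real versus corrupted (node, summary) pairs acts as a tractable surrogate for the mutual information being maximized.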