Total Variation Graph Neural Networks
- URL: http://arxiv.org/abs/2211.06218v2
- Date: Thu, 27 Apr 2023 20:29:12 GMT
- Title: Total Variation Graph Neural Networks
- Authors: Jonas Berg Hansen and Filippo Maria Bianchi
- Abstract summary: Recently proposed Graph Neural Networks (GNNs) for vertex clustering are trained with an unsupervised minimum cut objective.
We propose a GNN model that computes cluster assignments by optimizing a tighter relaxation of the minimum cut.
- Score: 5.571369922847262
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently proposed Graph Neural Networks (GNNs) for vertex clustering are
trained with an unsupervised minimum cut objective, approximated by a Spectral
Clustering (SC) relaxation. However, the SC relaxation is loose and, while it
offers a closed-form solution, it also yields overly smooth cluster assignments
that poorly separate the vertices. In this paper, we propose a GNN model that
computes cluster assignments by optimizing a tighter relaxation of the minimum
cut based on graph total variation (GTV). The cluster assignments can be used
directly to perform vertex clustering or to implement graph pooling in a graph
classification framework. Our model consists of two core components: i) a
message-passing layer that minimizes the $\ell_1$ distance in the features of
adjacent vertices, which is key to achieving sharp transitions between
clusters; ii) an unsupervised loss function that minimizes the GTV of the
cluster assignments while ensuring balanced partitions. Experimental results
show that our model outperforms other GNNs for vertex clustering and graph
classification.
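The abstract describes the two core components only at a high level. As an illustration of the loss side, below is a minimal PyTorch sketch of a GTV-style clustering objective: the graph total variation of soft cluster assignments plus a balance term. The function name gtv_clustering_loss, the normalization, and the simple uniform-size balance penalty are our own assumptions for illustration, not the paper's exact formulation.

```python
import torch

def gtv_clustering_loss(S, edge_index, edge_weight=None, balance_coeff=1.0):
    """Hedged sketch of a GTV-style clustering loss.

    S:           (N, K) soft cluster assignments (rows sum to 1, e.g. softmax output).
    edge_index:  (2, E) COO edge list of the graph.
    edge_weight: (E,) optional edge weights; defaults to 1.
    """
    src, dst = edge_index
    if edge_weight is None:
        edge_weight = torch.ones(src.size(0), device=S.device)

    # Graph total variation of the assignments:
    # sum_{(i,j) in E} w_ij * ||s_i - s_j||_1
    # (sharp transitions between clusters keep this term small)
    gtv = (edge_weight * (S[src] - S[dst]).abs().sum(dim=1)).sum()
    gtv = gtv / edge_weight.sum()  # normalization is our choice, not the paper's

    # Balance term (assumption): penalize deviation of cluster sizes from uniform.
    # The paper has its own balance formulation; this is a simple stand-in.
    cluster_sizes = S.sum(dim=0)                      # (K,)
    uniform = torch.full_like(cluster_sizes, S.size(0) / S.size(1))
    balance = (cluster_sizes - uniform).abs().sum() / S.size(0)

    return gtv + balance_coeff * balance
```

In use, S could be the softmax output of any GNN's final layer and edge_index the usual 2xE COO edge list; the loss can then be minimized end-to-end for unsupervised vertex clustering or as a graph-pooling objective.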
Related papers
- Cluster-based Graph Collaborative Filtering [55.929052969825825]
Graph Convolution Networks (GCNs) have succeeded in learning user and item representations for recommendation systems.
Most existing GCN-based methods overlook the multiple interests of users while performing high-order graph convolution.
We propose a novel GCN-based recommendation model, termed Cluster-based Graph Collaborative Filtering (ClusterGCF)
arXiv Detail & Related papers (2024-04-16T07:05:16Z) - Learning Uniform Clusters on Hypersphere for Deep Graph-level Clustering [25.350054742471816]
We propose a novel deep graph-level clustering method called Uniform Deep Graph Clustering (UDGC)
UDGC assigns instances evenly to different clusters and then scatters those clusters on the unit hypersphere, leading to a more uniform cluster-level distribution and less severe cluster collapse.
Our empirical study on eight well-known datasets demonstrates that UDGC significantly outperforms the state-of-the-art models.
arXiv Detail & Related papers (2023-11-23T12:08:20Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified in a single framework.
To provide feedback on the clustering actions, a clustering-oriented reward function is proposed that enhances cohesion within the same cluster and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - Dink-Net: Neural Clustering on Large Graphs [59.10189693120368]
A deep graph clustering method (Dink-Net) is proposed with the idea of dilation and shrink.
By discriminating whether nodes have been corrupted by augmentations, representations are learned in a self-supervised manner.
The clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss.
Compared to the runner-up, Dink-Net achieves a 9.62% NMI improvement on the ogbn-papers100M dataset with 111 million nodes and 1.6 billion edges.
arXiv Detail & Related papers (2023-05-28T15:33:24Z) - GLCC: A General Framework for Graph-level Clustering [5.069852282550117]
This paper studies the problem of graph-level clustering, which is a novel yet challenging task.
We propose a general graph-level clustering framework named Graph-Level Contrastive Clustering (GLCC)
Experiments on a range of well-known datasets demonstrate the superiority of our proposed GLCC over competitive baselines.
arXiv Detail & Related papers (2022-10-21T11:08:10Z) - Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC)
In SCAGC, a self-supervised contrastive loss that leverages inaccurate clustering labels is designed for node representation learning.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z) - Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z) - Seeing All From a Few: Nodes Selection Using Graph Pooling for Graph Clustering [37.68977275752782]
Noisy edges and nodes in the graph may degrade the clustering results.
We propose a novel dual graph embedding network(DGEN) to improve the robustness of the graph clustering to the noisy nodes and edges.
Experiments on three benchmark graph datasets demonstrate its superiority over several state-of-the-art algorithms.
arXiv Detail & Related papers (2021-04-30T06:51:51Z) - CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning [19.432449825536423]
Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision.
We present a novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques.
arXiv Detail & Related papers (2020-09-03T13:57:18Z) - Local Graph Clustering with Network Lasso [90.66817876491052]
We study the statistical and computational properties of a network Lasso method for local graph clustering.
The clusters delivered by nLasso can be characterized elegantly via network flows between cluster boundary and seed nodes.
arXiv Detail & Related papers (2020-04-25T17:52:05Z)