ClusterFuG: Clustering Fully connected Graphs by Multicut
- URL: http://arxiv.org/abs/2301.12159v2
- Date: Mon, 5 Jun 2023 08:56:05 GMT
- Title: ClusterFuG: Clustering Fully connected Graphs by Multicut
- Authors: Ahmed Abbas and Paul Swoboda
- Abstract summary: In dense multicut, the clustering objective is given in a factorized form as inner products of node feature vectors.
We show how to rewrite classical greedy algorithms for multicut in our dense setting and how to modify them for greater efficiency and solution quality.
- Score: 20.254912065749956
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a graph clustering formulation based on multicut (a.k.a. weighted
correlation clustering) on the complete graph. Our formulation does not need
specification of the graph topology as in the original sparse formulation of
multicut, making our approach simpler and potentially better performing. In
contrast to unweighted correlation clustering, we allow for a more expressive
weighted cost structure. In dense multicut, the clustering objective is given
in a factorized form as inner products of node feature vectors. This allows for
an efficient formulation and inference in contrast to multicut/weighted
correlation clustering, which has at least quadratic representation and
computation complexity when working on the complete graph. We show how to
rewrite classical greedy algorithms for multicut in our dense setting and how
to modify them for greater efficiency and solution quality. In particular, our
algorithms scale to graphs with tens of thousands of nodes. Empirical evidence
on instance segmentation on Cityscapes and clustering of ImageNet datasets
shows the merits of our approach.
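The factorized cost structure is what makes the dense formulation tractable. The following is a minimal, illustrative Python sketch, not the paper's implementation: it assumes the cost of joining nodes u and v is the inner product of their feature vectors f_u and f_v, so the join cost between two clusters factorizes into an inner product of summed feature vectors, and it greedily merges the most attractive cluster pair in the spirit of greedy additive edge contraction. The brute-force pair scan is kept only for clarity; the paper's modified algorithms avoid it in order to scale to tens of thousands of nodes.

```python
# Hedged sketch of greedy agglomeration for dense multicut with factorized costs.
# Assumption: the pairwise cost of joining nodes u, v is <f_u, f_v>, so the join
# cost between clusters A and B equals <sum_{u in A} f_u, sum_{v in B} f_v>.
import numpy as np

def dense_greedy_agglomeration(features: np.ndarray) -> list:
    """features: (n, d) array of node feature vectors; returns clusters as sets of node indices."""
    clusters = [{i} for i in range(len(features))]                    # start from singleton clusters
    sums = [features[i].astype(float).copy() for i in range(len(features))]  # per-cluster feature sums

    while len(clusters) > 1:
        # brute-force search for the cluster pair with the largest positive join cost
        best_gain, best_pair = 0.0, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                gain = float(sums[a] @ sums[b])                       # factorized join cost <F_A, F_B>
                if gain > best_gain:
                    best_gain, best_pair = gain, (a, b)
        if best_pair is None:                                         # no attractive pair left: stop
            break
        a, b = best_pair
        clusters[a] |= clusters[b]                                    # merge cluster b into cluster a
        sums[a] = sums[a] + sums[b]                                   # feature sum of the merged cluster
        del clusters[b]
        del sums[b]
    return clusters
```

For example, `dense_greedy_agglomeration(np.random.randn(200, 32))` returns a list of node-index sets, one per cluster; the merge stops once no remaining cluster pair has a positive (attractive) join cost.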
Related papers
- Multi-order Graph Clustering with Adaptive Node-level Weight Learning [8.975255910740646]
We propose a multi-order graph clustering model (MOGC) to integrate multiple higher-order structures and edge connections at the node level.
MOGC employs an adaptive weight learning mechanism to adjust the contributions of different motifs for each node.
Experiments on seven real-world datasets illustrate the effectiveness of MOGC.
arXiv Detail & Related papers (2024-05-20T17:09:58Z)
- Cluster-based Graph Collaborative Filtering [55.929052969825825]
Graph Convolution Networks (GCNs) have succeeded in learning user and item representations for recommendation systems.
Most existing GCN-based methods overlook the multiple interests of users while performing high-order graph convolution.
We propose a novel GCN-based recommendation model, termed Cluster-based Graph Collaborative Filtering (ClusterGCF).
arXiv Detail & Related papers (2024-04-16T07:05:16Z)
- MeanCut: A Greedy-Optimized Graph Clustering via Path-based Similarity and Degree Descent Criterion [0.6906005491572401]
Spectral clustering is popular and attractive due to its remarkable performance, easy implementation, and strong adaptability.
We propose MeanCut as the objective function and greedily optimize it in degree descending order for a nondestructive graph partition.
The validity of our algorithm is demonstrated by testing on real-world benchmarks and a face recognition application.
arXiv Detail & Related papers (2023-12-07T06:19:39Z)
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within clusters and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
- Latent Random Steps as Relaxations of Max-Cut, Min-Cut, and More [30.919536115917726]
We present a probabilistic model based on non-negative matrix factorization which unifies clustering and simplification.
By relaxing the hard clustering to a soft clustering, our algorithm relaxes potentially hard clustering problems to tractable ones (see the generic NMF soft-clustering sketch after this list).
arXiv Detail & Related papers (2023-08-12T02:47:57Z)
- Dink-Net: Neural Clustering on Large Graphs [59.10189693120368]
A deep graph clustering method (Dink-Net) is proposed with the idea of dilation and shrink.
By discriminating whether nodes have been corrupted by augmentations, representations are learned in a self-supervised manner.
The clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss.
Compared to the runner-up, Dink-Net achieves a 9.62% NMI improvement on the ogbn-papers100M dataset with 111 million nodes and 1.6 billion edges.
arXiv Detail & Related papers (2023-05-28T15:33:24Z)
- Fine-grained Graph Learning for Multi-view Subspace Clustering [2.4094285826152593]
We propose a fine-grained graph learning framework for multi-view subspace clustering (FGL-MSC).
The main challenge is how to optimize the fine-grained fusion weights while generating the learned graph that fits the clustering task.
Experiments on eight real-world datasets show that the proposed framework has comparable performance to the state-of-the-art methods.
arXiv Detail & Related papers (2022-01-12T18:00:29Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Structured Graph Learning for Scalable Subspace Clustering: From Single-view to Multi-view [28.779909990410978]
Graph-based subspace clustering methods have exhibited promising performance.
They still suffer from several drawbacks: expensive time overhead, failure to explore explicit clusters, and inability to generalize to unseen data points.
We propose a scalable graph learning framework, seeking to address the above three challenges simultaneously.
arXiv Detail & Related papers (2021-02-16T03:46:11Z)
- Structured Graph Learning for Clustering and Semi-supervised Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and an adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
- Adaptive Graph Auto-Encoder for General Data Clustering [90.8576971748142]
Graph-based clustering plays an important role in the clustering area.
Recent studies on graph convolutional neural networks have achieved impressive success on graph-structured data.
We propose a graph auto-encoder for general data clustering, which constructs the graph adaptively according to the generative perspective of graphs.
arXiv Detail & Related papers (2020-02-20T10:11:28Z)
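As referenced in the Latent Random Steps entry above, here is a generic sketch of NMF-based soft graph clustering: it factorizes a non-negative adjacency matrix and reads the factor rows as soft memberships. It is a textbook construction under assumed inputs, not the probabilistic model of that paper, and the function name `soft_graph_clustering` is hypothetical.

```python
# Generic illustration of soft graph clustering via non-negative matrix
# factorization (NMF): factor a non-negative adjacency matrix A ~= W H with
# W, H >= 0 and interpret the rows of W as soft cluster memberships.
import numpy as np
from sklearn.decomposition import NMF

def soft_graph_clustering(adjacency: np.ndarray, n_clusters: int) -> np.ndarray:
    """adjacency: (n, n) non-negative matrix; returns (n, n_clusters) soft memberships."""
    model = NMF(n_components=n_clusters, init="nndsvda", max_iter=500)
    W = model.fit_transform(adjacency)                     # node-by-cluster factor
    # normalise rows so each node's memberships sum to one (soft assignment)
    W = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    return W

# A hard clustering can be recovered by taking the argmax over soft memberships:
# labels = soft_graph_clustering(A, 5).argmax(axis=1)
```

Replacing the hard assignment with continuous memberships is one standard way a combinatorially hard clustering objective is relaxed into a tractable optimization problem.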
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.