Minimizing Localized Ratio Cut Objectives in Hypergraphs
- URL: http://arxiv.org/abs/2002.09441v2
- Date: Wed, 1 Jul 2020 02:48:48 GMT
- Title: Minimizing Localized Ratio Cut Objectives in Hypergraphs
- Authors: Nate Veldt and Austin R. Benson and Jon Kleinberg
- Abstract summary: We present a framework for local hypergraph clustering based on minimizing localized ratio cut objectives.
Our algorithm is strongly-local, meaning that its runtime depends only on the size of the input set, and does not need to explore the entire hypergraph to find good local clusters.
- Score: 32.80813008862995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hypergraphs are a useful abstraction for modeling multiway relationships in
data, and hypergraph clustering is the task of detecting groups of closely
related nodes in such data. Graph clustering has been studied extensively, and
there are numerous methods for detecting small, localized clusters without
having to explore an entire input graph. However, there are only a few
specialized approaches for localized clustering in hypergraphs. Here we present
a framework for local hypergraph clustering based on minimizing localized ratio
cut objectives. Our framework takes an input set of reference nodes in a
hypergraph and solves a sequence of hypergraph minimum $s$-$t$ cut problems in
order to identify a nearby well-connected cluster of nodes that overlaps
substantially with the input set.
Our methods extend graph-based techniques but are significantly more general
and have new output quality guarantees. First, our methods can minimize new
generalized notions of hypergraph cuts, which depend on specific configurations
of nodes within each hyperedge, rather than just on the number of cut
hyperedges. Second, our framework has several attractive theoretical properties
in terms of output cluster quality. Most importantly, our algorithm is
strongly-local, meaning that its runtime depends only on the size of the input
set, and does not need to explore the entire hypergraph to find good local
clusters. We use our methodology to effectively identify clusters in
hypergraphs of real-world data with millions of nodes, millions of hyperedges,
and large average hyperedge size with runtimes ranging between a few seconds
and a few minutes.
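To make the "sequence of minimum $s$-$t$ cut problems" idea concrete, here is a minimal sketch of a Dinkelbach-style ratio-cut loop on an ordinary graph using networkx. It is not the authors' HyperLocal implementation: it ignores hyperedge gadgets and the strongly-local solver, and the function names (`localized_ratio_cut`, `best_set_for_alpha`, `find_local_cluster`), the locality penalty, and the exact objective normalization are illustrative assumptions.

```python
# Sketch: repeatedly solve s-t min cuts to lower a localized ratio objective.
# Simplified graph-only analogue of the paper's approach; not the authors' code.
import networkx as nx

def localized_ratio_cut(G, S, R):
    """Assumed objective form: cut(S) / vol(S intersect R)."""
    S = set(S)
    cut = sum(1 for u, v in G.edges() if (u in S) != (v in S))
    vol = sum(G.degree(v) for v in S & set(R))
    return cut / vol if vol > 0 else float("inf")

def best_set_for_alpha(G, R, alpha, locality=0.1):
    """One s-t cut subproblem: find a set that beats ratio alpha, if any."""
    H = nx.DiGraph()
    s, t = "_source", "_sink"
    for u, v in G.edges():
        H.add_edge(u, v, capacity=1.0)
        H.add_edge(v, u, capacity=1.0)
    R = set(R)
    for v in G.nodes():
        if v in R:
            # Pay alpha * deg(v) for excluding a reference node from S.
            H.add_edge(s, v, capacity=alpha * G.degree(v))
        else:
            # Small penalty for including non-reference nodes (keeps S local).
            H.add_edge(v, t, capacity=alpha * locality * G.degree(v))
    _, (source_side, _) = nx.minimum_cut(H, s, t)
    return set(source_side) - {s}

def find_local_cluster(G, R, iters=20):
    """Iterate: solve a cut for the current ratio, accept if it improves."""
    S = set(R)
    alpha = localized_ratio_cut(G, S, R)
    for _ in range(iters):
        S_new = best_set_for_alpha(G, R, alpha)
        new_alpha = localized_ratio_cut(G, S_new, R)
        if not S_new or new_alpha >= alpha:
            break
        S, alpha = S_new, new_alpha
    return S, alpha

# Toy usage: two cliques joined by a path node; seeds sit in the first clique.
G = nx.barbell_graph(8, 1)
cluster, ratio = find_local_cluster(G, [0, 1, 2])
print(sorted(cluster), ratio)
```

The key design point the sketch tries to convey is that each subproblem only asks a yes/no question ("is there a set with ratio below the current value?") expressible as a min cut, so the ratio objective never has to be optimized directly; the paper's contribution is doing this over generalized hypergraph cut functions while keeping each solve strongly local.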
Related papers
- Hyperedge Modeling in Hypergraph Neural Networks by using Densest Overlapping Subgraphs [0.0]
One of the most important problems in graph clustering is to find the densest overlapping subgraphs (DOS).
In this paper, we propose a solution to the DOS problem via an agglomerative algorithm (DOSAGE) as a novel approach to enhance the process of generating the densest overlapping subgraphs.
Experiments on standard benchmarks show that the DOSAGE algorithm significantly outperforms the HGNNs and six other methods on the node classification task.
arXiv Detail & Related papers (2024-09-16T14:56:10Z) - DGCLUSTER: A Neural Framework for Attributed Graph Clustering via Modularity Maximization [5.329981192545312]
We propose a novel method, DGCluster, which primarily optimizes the modularity objective using graph neural networks and scales linearly with the graph size.
We extensively test DGCluster on several real-world datasets of varying sizes, across multiple popular cluster quality metrics.
Our approach consistently outperforms the state-of-the-art methods, demonstrating significant performance gains in almost all settings.
arXiv Detail & Related papers (2023-12-20T01:43:55Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT)
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within clusters and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - BOURNE: Bootstrapped Self-supervised Learning Framework for Unified Graph Anomaly Detection [50.26074811655596]
We propose a novel unified graph anomaly detection framework based on bootstrapped self-supervised learning (named BOURNE)
By swapping the context embeddings between nodes and edges, we enable the mutual detection of node and edge anomalies.
BOURNE can eliminate the need for negative sampling, thereby enhancing its efficiency in handling large graphs.
arXiv Detail & Related papers (2023-07-28T00:44:57Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Dink-Net: Neural Clustering on Large Graphs [59.10189693120368]
A deep graph clustering method (Dink-Net) is proposed with the idea of dilation and shrink.
By discriminating whether nodes have been corrupted by augmentations, representations are learned in a self-supervised manner.
The clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss.
Compared to the runner-up, Dink-Net achieves a 9.62% NMI improvement on the ogbn-papers100M dataset with 111 million nodes and 1.6 billion edges.
arXiv Detail & Related papers (2023-05-28T15:33:24Z) - Graph-based Semi-supervised Local Clustering with Few Labeled Nodes [6.493238575291165]
Local clustering aims at extracting a local structure inside a graph without the necessity of knowing the entire graph structure.
We propose a new semi-supervised local clustering approach using only a few labeled nodes.
arXiv Detail & Related papers (2022-11-20T22:55:07Z) - Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC)
In SCAGC, by leveraging inaccurate clustering labels, a self-supervised contrastive loss is designed for node representation learning.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z) - HyperSF: Spectral Hypergraph Coarsening via Flow-based Local Clustering [9.438207505148947]
We propose an efficient spectral hypergraph coarsening scheme (HyperSF) for preserving the original spectral (structural) properties of hypergraphs.
Our results show that the proposed hypergraph coarsening algorithm can significantly improve the multi-way conductance of hypergraph clustering.
arXiv Detail & Related papers (2021-08-17T22:20:23Z) - Community Detection in General Hypergraph via Graph Embedding [1.4213973379473654]
We propose a novel method for detecting community structure in general hypergraph networks, uniform or non-uniform.
The proposed method introduces a null vertex to augment a non-uniform hypergraph into a uniform multi-hypergraph, and then embeds the multi-hypergraph in a low-dimensional vector space.
arXiv Detail & Related papers (2021-03-28T03:23:03Z)