HyperEF: Spectral Hypergraph Coarsening by Effective-Resistance Clustering
- URL: http://arxiv.org/abs/2210.14813v1
- Date: Wed, 26 Oct 2022 16:01:24 GMT
- Title: HyperEF: Spectral Hypergraph Coarsening by Effective-Resistance Clustering
- Authors: Ali Aghdaei, Zhuo Feng
- Abstract summary: This paper introduces a scalable algorithmic framework (HyperEF) for spectral coarsening (decomposition) of large-scale hypergraphs.
Motivated by the latest theoretical framework for low-resistance-diameter decomposition of simple graphs, HyperEF aims at decomposing large hypergraphs into multiple node clusters.
- Score: 7.6146285961466
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This paper introduces a scalable algorithmic framework (HyperEF) for spectral coarsening (decomposition) of large-scale hypergraphs by exploiting hyperedge effective resistances. Motivated by the latest theoretical framework for low-resistance-diameter decomposition of simple graphs, HyperEF aims at decomposing large hypergraphs into multiple node clusters with only a few inter-cluster hyperedges. The key component in HyperEF is a nearly-linear-time algorithm for estimating hyperedge effective resistances, which allows incorporating the latest diffusion-based non-linear quadratic operators defined on hypergraphs. To achieve good runtime scalability, HyperEF searches within the Krylov subspace (or approximate eigensubspace) to identify nearly-optimal vectors for approximating the hyperedge effective resistances. In addition, a node weight propagation scheme for multilevel spectral hypergraph decomposition is introduced to achieve even greater node coarsening ratios. Extensive experiments on real-world VLSI designs show that, compared with state-of-the-art hypergraph partitioning (clustering) methods, HyperEF can more effectively coarsen (decompose) hypergraphs without losing key structural (spectral) properties of the original hypergraphs, while achieving over $70\times$ runtime speedups over hMetis and $20\times$ speedups over HyperSF.
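The effective-resistance-based clustering idea in the abstract can be illustrated with a small, hedged sketch. This is not the HyperEF implementation: it uses a naive unit-weight clique expansion of the hypergraph and an exact Laplacian pseudoinverse in place of HyperEF's nearly-linear-time Krylov-subspace estimator, and taking the maximum pairwise resistance inside a hyperedge as its "hyperedge effective resistance" is an illustrative assumption, not the paper's definition.

```python
import numpy as np

def clique_expansion_laplacian(n, hyperedges):
    """Unit-weight clique expansion of a hypergraph into a graph Laplacian.
    (HyperEF works on the hypergraph directly; this expansion is a
    simplification for illustration.)"""
    L = np.zeros((n, n))
    for e in hyperedges:
        for i in range(len(e)):
            for j in range(i + 1, len(e)):
                u, v = e[i], e[j]
                L[u, u] += 1.0
                L[v, v] += 1.0
                L[u, v] -= 1.0
                L[v, u] -= 1.0
    return L

def hyperedge_effective_resistance(L, e):
    """Proxy hyperedge resistance: the maximum pairwise effective
    resistance R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v) among the
    hyperedge's nodes, computed via the Laplacian pseudoinverse."""
    Lp = np.linalg.pinv(L)
    r = 0.0
    for i in range(len(e)):
        for j in range(i + 1, len(e)):
            chi = np.zeros(L.shape[0])
            chi[e[i]] = 1.0
            chi[e[j]] = -1.0
            r = max(r, float(chi @ Lp @ chi))
    return r

# Two 3-node cliques bridged by a single 2-node hyperedge: the bridge
# carries the highest resistance, so the low-resistance (intra-cluster)
# hyperedges would be contracted first during coarsening.
hyperedges = [[0, 1, 2], [3, 4, 5], [2, 3]]
L = clique_expansion_laplacian(6, hyperedges)
ranked = sorted(hyperedges, key=lambda e: hyperedge_effective_resistance(L, e))
print(ranked)  # the bridge [2, 3] ranks last
```

HyperEF replaces the dense pseudoinverse used here with a few vectors drawn from the Krylov subspace, which is what makes resistance estimation feasible in nearly linear time on large hypergraphs.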
Related papers
- Scalable and Effective Negative Sample Generation for Hyperedge Prediction [55.9298019975967]
Hyperedge prediction is crucial for understanding complex multi-entity interactions in web-based applications.
Traditional methods often face difficulties in generating high-quality negative samples due to the imbalance between positive and negative instances.
We present the scalable and effective negative sample generation for Hyperedge Prediction (SEHP) framework, which utilizes diffusion models to tackle these challenges.
arXiv Detail & Related papers (2024-11-19T09:16:25Z)
- SHyPar: A Spectral Coarsening Approach to Hypergraph Partitioning [4.110108749051657]
We introduce a multilevel spectral framework, SHyPar, for large-scale hypergraphs by leveraging hyperedge effective resistances and flow-based community detection techniques.
A key component of SHyPar is a flow-based local clustering scheme for hypergraph coarsening, which incorporates a max-flow-based algorithm to produce nodes with substantially improved conductance.
arXiv Detail & Related papers (2024-10-09T03:29:47Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- A Unified View Between Tensor Hypergraph Neural Networks And Signal Denoising [7.083679120873857]
The tensor-hypergraph convolutional network (T-HGCN) has emerged as a powerful architecture for preserving higher-order interactions on hypergraphs.
We further design a tensor-hypergraph iterative network (T-HGIN) based on the HyperGSD problem, which takes advantage of a multi-step updating scheme in every single layer.
arXiv Detail & Related papers (2023-09-15T13:19:31Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework built on high-order outer-product message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Hypergraphs with Edge-Dependent Vertex Weights: p-Laplacians and Spectral Clustering [29.400924845055865]
We study p-Laplacians and spectral clustering for a recently proposed hypergraph model that incorporates edge-dependent vertex weights (EDVWs).
We convert hypergraphs with EDVWs into submodular hypergraphs for which the spectral theory is better developed.
Existing concepts and theorems such as p-Laplacians and Cheeger inequalities proposed under the submodular hypergraph setting can be directly extended to hypergraphs with EDVWs.
arXiv Detail & Related papers (2022-08-15T22:29:00Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- HyperSF: Spectral Hypergraph Coarsening via Flow-based Local Clustering [9.438207505148947]
We propose an efficient spectral hypergraph coarsening scheme (HyperSF) for preserving the original spectral (structural) properties of hypergraphs.
Our results show that the proposed hypergraph coarsening algorithm can significantly improve the multi-way conductance of hypergraph clustering.
arXiv Detail & Related papers (2021-08-17T22:20:23Z)
- Learning Spatial-Spectral Prior for Super-Resolution of Hyperspectral Imagery [79.69449412334188]
In this paper, we investigate how to adapt state-of-the-art residual-learning-based single gray/RGB image super-resolution approaches.
We introduce a spatial-spectral prior network (SSPN) to fully exploit the spatial information and the correlation between the spectra of the hyperspectral data.
Experimental results on some hyperspectral images demonstrate that the proposed SSPSR method enhances the details of the recovered high-resolution hyperspectral images.
arXiv Detail & Related papers (2020-05-18T14:25:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.