Generative hypergraph clustering: from blockmodels to modularity
- URL: http://arxiv.org/abs/2101.09611v2
- Date: Wed, 27 Jan 2021 21:26:58 GMT
- Title: Generative hypergraph clustering: from blockmodels to modularity
- Authors: Philip S. Chodrow, Nate Veldt, and Austin R. Benson
- Abstract summary: We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, as illustrated by an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
- Score: 26.99290024958576
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hypergraphs are a natural modeling paradigm for a wide range of complex
relational systems with multibody interactions. A standard analysis task is to
identify clusters of closely related or densely interconnected nodes. While
many probabilistic generative models for graph clustering have been proposed,
there are relatively few such models for hypergraphs. We propose a Poisson
degree-corrected hypergraph stochastic blockmodel (DCHSBM), an expressive
generative model of clustered hypergraphs with heterogeneous node degrees and
edge sizes. Approximate maximum-likelihood inference in the DCHSBM naturally
leads to a clustering objective that generalizes the popular modularity
objective for graphs. We derive a general Louvain-type algorithm for this
objective, as well as a faster, specialized "All-Or-Nothing" (AON) variant in
which edges are expected to lie fully within clusters. This special case
encompasses a recent proposal for modularity in hypergraphs, while also
incorporating flexible resolution and edge-size parameters. We show that
hypergraph Louvain is highly scalable, as illustrated by an experiment on a
synthetic hypergraph of one million nodes. We also demonstrate through
synthetic experiments that the detectability regimes for hypergraph community
detection differ from methods based on dyadic graph projections. In particular,
there are regimes in which hypergraph methods can recover planted partitions
even though graph-based methods necessarily fail due to information-theoretic
limits. We use our model to analyze different patterns of higher-order
structure in school contact networks, U.S. congressional bill cosponsorship,
U.S. congressional committees, product categories in co-purchasing behavior,
and hotel locations from web browsing sessions, and find that it is able to
recover ground-truth clusters in empirical data sets exhibiting the
corresponding higher-order structure.
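As a rough illustration of the clustering objective described above, the sketch below scores a partition with a simplified all-or-nothing (AON) hypergraph modularity (reward hyperedges that fall entirely within one cluster, subtract a volume-based null term) and refines labels with a one-level, Louvain-style local-moving pass. The function names, the resolution parameter gamma, the per-size weights beta, and the simplified null term are illustrative assumptions, not the paper's exact DCHSBM likelihood or the authors' released code.

```python
# Minimal sketch (not the authors' implementation): a simplified all-or-nothing
# hypergraph modularity plus a one-level Louvain-style local-moving pass.
from collections import Counter

def aon_modularity(edges, labels, degrees, gamma=1.0, beta=None):
    """Reward hyperedges fully contained in one cluster; subtract a simplified
    volume-based null term per edge size (a stand-in for the DCHSBM null)."""
    beta = beta or {}                              # optional weight per edge size
    vol = sum(degrees.values())                    # total degree (hypergraph volume)
    coverage = sum(beta.get(len(e), 1.0)
                   for e in edges
                   if len({labels[v] for v in e}) == 1)
    cluster_vol = Counter()                        # total degree within each cluster
    for v, d in degrees.items():
        cluster_vol[labels[v]] += d
    sizes = Counter(len(e) for e in edges)         # number of edges of each size k
    null = sum(gamma * m_k * sum((cv / vol) ** k for cv in cluster_vol.values())
               for k, m_k in sizes.items())
    return coverage - null

def greedy_local_moves(edges, degrees, gamma=1.0, n_sweeps=10):
    """Start from singleton clusters and repeatedly move each node to the
    neighboring cluster label that most increases the objective."""
    labels = {v: v for v in degrees}
    incident = {v: [e for e in edges if v in e] for v in degrees}
    for _ in range(n_sweeps):
        moved = False
        for v in degrees:
            base = aon_modularity(edges, labels, degrees, gamma)
            best_label, best_gain = labels[v], 0.0
            for c in {labels[u] for e in incident[v] for u in e}:
                old, labels[v] = labels[v], c      # tentatively move v to cluster c
                gain = aon_modularity(edges, labels, degrees, gamma) - base
                labels[v] = old
                if gain > best_gain:
                    best_label, best_gain = c, gain
            if best_label != labels[v]:
                labels[v], moved = best_label, True
        if not moved:
            break
    return labels

# Toy usage: two small groups joined by one bridging edge.
edges = [("a", "b", "c"), ("a", "c"), ("d", "e", "f"), ("e", "f"), ("c", "d")]
degrees = Counter(v for e in edges for v in e)
print(greedy_local_moves(edges, degrees, gamma=0.5))
```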
Related papers
- SPHINX: Structural Prediction using Hypergraph Inference Network [19.853413818941608]
We introduce Structural Prediction using Hypergraph Inference Network (SPHINX), a model that learns to infer a latent hypergraph structure in an unsupervised way.
We show that recent advances in k-subset sampling provide a suitable tool for producing discrete hypergraph structures.
The resulting model can generate the higher-order structure necessary for any modern hypergraph neural network.
arXiv Detail & Related papers (2024-10-04T07:49:57Z) - HyperSMOTE: A Hypergraph-based Oversampling Approach for Imbalanced Node Classifications [2.172034690069702]
We propose HyperSMOTE as a solution to alleviate the class imbalance issue in hypergraph learning.
We synthesize new nodes based on samples from minority classes and their neighbors.
To solve the problem of integrating the new nodes into the hypergraph, we train a decoder based on the original hypergraph incidence matrix.
arXiv Detail & Related papers (2024-09-09T08:01:28Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs) using neural networks to encode hypergraphs provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z) - Nonbacktracking spectral clustering of nonuniform hypergraphs [2.408714894793063]
We study spectral clustering for nonuniform hypergraphs based on the hypergraph nonbacktracking operator.
We propose an alternating algorithm for inference in a hypergraph blockmodel via linearized belief-propagation.
arXiv Detail & Related papers (2022-04-27T01:14:06Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Learnable Hypergraph Laplacian for Hypergraph Learning [34.28748027233654]
HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph structured data.
We propose the first learning-based method tailored for constructing adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD).
HERALD adaptively optimizes the adjacency relationship between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned.
arXiv Detail & Related papers (2021-06-12T02:07:07Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.