Hypergraphs with Edge-Dependent Vertex Weights: Spectral Clustering based on the 1-Laplacian
- URL: http://arxiv.org/abs/2305.00462v1
- Date: Sun, 30 Apr 2023 12:21:42 GMT
- Title: Hypergraphs with Edge-Dependent Vertex Weights: Spectral Clustering based on the 1-Laplacian
- Authors: Yu Zhu, Boning Li, Santiago Segarra
- Abstract summary: We propose a flexible framework for defining the 1-Laplacian of a hypergraph that incorporates edge-dependent vertex weights.
We show that for a special case within our framework, the corresponding hypergraph 1-Laplacian is equivalent to the 1-Laplacian of a related graph.
- Score: 24.88719567631694
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a flexible framework for defining the 1-Laplacian of a hypergraph
that incorporates edge-dependent vertex weights. These weights are able to
reflect the varying importance of vertices within a hyperedge, thus conferring on the
hypergraph model higher expressivity than homogeneous hypergraphs. We then
utilize the eigenvector associated with the second smallest eigenvalue of the
hypergraph 1-Laplacian to cluster the vertices. From a theoretical standpoint
based on an adequately defined normalized Cheeger cut, this procedure is
expected to achieve higher clustering accuracy than that based on the
traditional Laplacian. Indeed, we confirm that this is the case using
real-world datasets to demonstrate the effectiveness of the proposed spectral
clustering approach. Moreover, we show that for a special case within our
framework, the corresponding hypergraph 1-Laplacian is equivalent to the
1-Laplacian of a related graph, whose eigenvectors can be computed more
efficiently, facilitating its adoption on larger datasets.
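To make the clustering step concrete, here is a minimal sketch that thresholds the eigenvector of the second-smallest eigenvalue, using the ordinary graph (2-)Laplacian of a toy graph as a stand-in; the paper instead uses the hypergraph 1-Laplacian with EDVWs, whose second eigenvector cannot be obtained from a standard dense eigensolver. The function name and the toy graph below are purely illustrative.

```python
import numpy as np

def spectral_bipartition(A):
    """Split vertices in two via the eigenvector of the second-smallest eigenvalue."""
    d = A.sum(axis=1)
    L = np.diag(d) - A                    # combinatorial (2-)Laplacian, not the 1-Laplacian
    _, eigvecs = np.linalg.eigh(L)        # eigenvalues returned in ascending order
    fiedler = eigvecs[:, 1]               # eigenvector paired with the second-smallest eigenvalue
    return fiedler >= np.median(fiedler)  # threshold the entries to obtain two clusters

# Toy example: two triangles joined by a single bridge edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
print(spectral_bipartition(A))  # separates {0, 1, 2} from {3, 4, 5}
```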
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Hypergraph Structure Inference From Data Under Smoothness Prior [46.568839316694515]
We propose a method to infer the probability for each potential hyperedge without labelled data as supervision.
We use a smoothness prior to derive the relation between the hypergraph structure and the node features via probabilistic modelling.
Experiments on both synthetic and real-world data demonstrate that our method can learn meaningful hypergraph structures from data more efficiently than existing hypergraph structure inference methods.
arXiv Detail & Related papers (2023-08-27T18:28:58Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
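As a toy reference for the idea of "minimizers of an energy as node embeddings" in the paper above, the sketch below minimizes the simplest such energy: a quadratic clique-expansion smoothness term plus an anchoring term. The parameterized energy family and bilevel training in that paper are considerably richer; everything here, including the hyperedges and features, is an illustrative assumption.

```python
import numpy as np

def hypergraph_smoothing_embeddings(hyperedges, X0, lam=1.0, steps=200, lr=0.01):
    """Minimize sum_e sum_{i<j in e} ||x_i - x_j||^2 + lam * ||X - X0||^2 by gradient descent."""
    n = X0.shape[0]
    # Clique-expansion Laplacian: each hyperedge contributes a complete subgraph.
    L = np.zeros((n, n))
    for e in hyperedges:
        for i in e:
            for j in e:
                if i != j:                  # both ordered copies of each pair give the clique Laplacian
                    L[i, i] += 1.0
                    L[i, j] -= 1.0
    X = X0.astype(float).copy()
    for _ in range(steps):
        grad = 2.0 * L @ X + 2.0 * lam * (X - X0)   # gradient of the quadratic energy
        X -= lr * grad
    return X                                         # smoothed node embeddings

# Toy usage with made-up hyperedges and random initial features.
hyperedges = [[0, 1, 2], [2, 3], [3, 4, 5]]
X0 = np.random.default_rng(0).normal(size=(6, 2))
print(hypergraph_smoothing_embeddings(hyperedges, X0).round(2))
```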
- Hypergraphs with Edge-Dependent Vertex Weights: p-Laplacians and Spectral Clustering [29.400924845055865]
We study p-Laplacians and spectral clustering for a recently proposed hypergraph model that incorporates edge-dependent vertex weights (EDVWs).
We convert hypergraphs with EDVWs into submodular hypergraphs for which the spectral theory is better developed.
Existing concepts and theorems such as p-Laplacians and Cheeger inequalities proposed under the submodular hypergraph setting can be directly extended to hypergraphs with EDVWs.
arXiv Detail & Related papers (2022-08-15T22:29:00Z)
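For orientation on the paper above, the graph-level functional that p-Laplacians generalize can be written in a few lines, assuming the standard definition Q_p(x) = (1/2) * sum_ij w_ij |x_i - x_j|^p; the submodular-hypergraph version used in that work is not reproduced here.

```python
import numpy as np

def graph_p_laplacian_energy(W, x, p):
    """Q_p(x) = 0.5 * sum_ij W[i, j] * |x_i - x_j|**p for a symmetric weight matrix W."""
    return 0.5 * np.sum(W * np.abs(x[:, None] - x[None, :]) ** p)

# Path graph on 3 vertices with unit weights.
W = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
x = np.array([0., 1., 3.])
print(graph_p_laplacian_energy(W, x, p=1))  # 1 + 2 = 3, a total-variation / cut-style objective
print(graph_p_laplacian_energy(W, x, p=2))  # 1 + 4 = 5, i.e. the quadratic form x^T L x
```

The p = 1 case is what connects these functionals to cuts and hence to 1-Laplacian based clustering.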
- Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z)
- Nonbacktracking spectral clustering of nonuniform hypergraphs [2.408714894793063]
We study spectral clustering for nonuniform hypergraphs based on the hypergraph nonbacktracking operator.
We propose an alternating algorithm for inference in a hypergraph blockmodel via linearized belief-propagation.
arXiv Detail & Related papers (2022-04-27T01:14:06Z)
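As background for the operator used in the paper above, here is a small sketch of the classical graph nonbacktracking matrix, indexed by directed edges; the generalization to nonuniform hypergraphs developed in that paper is not reproduced here.

```python
import numpy as np

def nonbacktracking_matrix(edges):
    """B[(u,v),(x,w)] = 1 iff v == x and w != u, over all directed versions of the edges."""
    directed = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    idx = {e: i for i, e in enumerate(directed)}
    B = np.zeros((len(directed), len(directed)))
    for (u, v) in directed:
        for (x, w) in directed:
            if x == v and w != u:           # continue the walk u -> v -> w without turning back
                B[idx[(u, v)], idx[(x, w)]] = 1.0
    return B

# Example: a triangle has 6 directed edges, so B is 6 x 6; the spectrum of B is
# the basis for nonbacktracking spectral clustering on ordinary graphs.
print(nonbacktracking_matrix([(0, 1), (1, 2), (0, 2)]).shape)
```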
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- Co-clustering Vertices and Hyperedges via Spectral Hypergraph Partitioning [18.800058655626696]
We propose a novel method to co-cluster the vertices and hyperedges of hypergraphs with edge-dependent vertex weights (EDVWs).
In our method, we leverage random walks with EDVWs to construct a hypergraph Laplacian and use its spectral properties to embed vertices and hyperedges in a common space.
We then cluster these embeddings to obtain our proposed co-clustering method, of particular relevance in applications requiring the simultaneous clustering of data entities and features.
arXiv Detail & Related papers (2021-02-19T21:47:39Z)
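To make the random-walk ingredient of the co-clustering paper above concrete, below is a hedged sketch of a two-step walk with EDVWs: from a vertex, pick an incident hyperedge proportionally to its edge weight, then pick a member vertex proportionally to that vertex's EDVW. The exact normalization and the joint embedding step in the paper may differ, and the toy EDVW values are made up.

```python
import numpy as np

def edvw_transition_matrix(incidence, edge_weights, gamma):
    """incidence: |V| x |E| 0/1 matrix; edge_weights: length-|E|; gamma: |V| x |E| EDVWs
    (gamma[v, e] = 0 when vertex v is not in hyperedge e; every vertex is in some hyperedge)."""
    # Vertex -> hyperedge: choose an incident hyperedge proportionally to its weight.
    P_ve = incidence * edge_weights[None, :]
    P_ve = P_ve / P_ve.sum(axis=1, keepdims=True)
    # Hyperedge -> vertex: choose a member vertex proportionally to its EDVW.
    P_ev = (gamma / gamma.sum(axis=0, keepdims=True)).T
    return P_ve @ P_ev        # |V| x |V| vertex-to-vertex transition matrix

# Toy example: 4 vertices, 2 hyperedges {0, 1, 2} and {2, 3} with unit edge weights.
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
gamma = np.array([[2, 0], [1, 0], [1, 3], [0, 1]], dtype=float)   # hypothetical EDVWs
print(edvw_transition_matrix(H, np.ones(2), gamma).round(2))
```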
- Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, demonstrating this with an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
arXiv Detail & Related papers (2021-01-24T00:25:22Z)
- Learning over Families of Sets -- Hypergraph Representation Learning for Higher Order Tasks [12.28143554382742]
We develop a hypergraph neural network to learn provably expressive representations of variable-sized hyperedges.
We evaluate performance on multiple real-world hypergraph datasets and demonstrate consistent, significant improvements in accuracy over state-of-the-art models.
arXiv Detail & Related papers (2021-01-19T18:37:50Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
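Dyadic independence is what makes the approximation above tractable, since sampling reduces to independent coin flips per vertex pair. The sketch below assumes a dense matrix of edge probabilities for clarity; a scalable implementation for graphs with millions of nodes would use a sparse or low-rank parameterization instead.

```python
import numpy as np

def sample_edge_independent_graph(P, rng=None):
    """Sample an undirected graph whose edges appear independently with probabilities P[i, j]."""
    rng = np.random.default_rng() if rng is None else rng
    n = P.shape[0]
    upper = np.triu(rng.random((n, n)) < P, k=1)   # decide each dyad once, above the diagonal
    return (upper | upper.T).astype(int)            # symmetrize to get an undirected adjacency

# Example: a 2-block model with dense blocks and sparse cross-block connections.
P = np.full((6, 6), 0.05)
P[:3, :3] = P[3:, 3:] = 0.6
print(sample_edge_independent_graph(P, np.random.default_rng(0)))
```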
This list is automatically generated from the titles and abstracts of the papers on this site.