Hypergraphs with Edge-Dependent Vertex Weights: p-Laplacians and
Spectral Clustering
- URL: http://arxiv.org/abs/2208.07457v1
- Date: Mon, 15 Aug 2022 22:29:00 GMT
- Title: Hypergraphs with Edge-Dependent Vertex Weights: p-Laplacians and
Spectral Clustering
- Authors: Yu Zhu and Santiago Segarra
- Abstract summary: We study p-Laplacians and spectral clustering for a recently proposed hypergraph model that incorporates edge-dependent vertex weights (EDVWs).
We convert hypergraphs with EDVWs into submodular hypergraphs for which the spectral theory is better developed.
Existing concepts and theorems such as p-Laplacians and Cheeger inequalities proposed under the submodular hypergraph setting can be directly extended to hypergraphs with EDVWs.
- Score: 29.400924845055865
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study p-Laplacians and spectral clustering for a recently proposed
hypergraph model that incorporates edge-dependent vertex weights (EDVWs). These
weights can reflect the varying importance of vertices within a hyperedge, thus
conferring higher expressivity and flexibility on the hypergraph model. By
constructing submodular EDVWs-based splitting functions, we convert hypergraphs
with EDVWs into submodular hypergraphs for which the spectral theory is better
developed. In this way, existing concepts and theorems such as p-Laplacians and
Cheeger inequalities proposed under the submodular hypergraph setting can be
directly extended to hypergraphs with EDVWs. For submodular hypergraphs with
EDVWs-based splitting functions, we propose an efficient algorithm to compute
the eigenvector associated with the second smallest eigenvalue of the
hypergraph 1-Laplacian. We then utilize this eigenvector to cluster the
vertices, achieving higher clustering accuracy than traditional spectral
clustering based on the 2-Laplacian. More broadly, the proposed algorithm works
for all submodular hypergraphs that are graph reducible. Numerical experiments
using real-world data demonstrate the effectiveness of combining spectral
clustering based on the 1-Laplacian and EDVWs.
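To make the conversion step more concrete, below is a minimal worked example of a submodular splitting function built from EDVWs; the specific form is an illustrative assumption consistent with the abstract, not necessarily the exact construction used in the paper.
```latex
% Illustrative (assumed) EDVW-based splitting function for a hyperedge e with
% edge-dependent vertex weights \gamma_e(v) \ge 0, v \in e:
\[
  w_e(S) \;=\; \min\Bigl\{ \sum_{v \in S} \gamma_e(v),\; \sum_{v \in e \setminus S} \gamma_e(v) \Bigr\},
  \qquad S \subseteq e .
\]
% It satisfies w_e(\emptyset) = w_e(e) = 0 and is submodular, since it is a
% concave function x \mapsto \min\{x,\, \gamma_e(e) - x\} composed with the
% modular map S \mapsto \sum_{v \in S} \gamma_e(v). Equipped with such
% splitting functions, the hypergraph becomes a submodular hypergraph, so the
% p-Laplacians and Cheeger inequalities from that setting apply directly.
```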
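The paper's 1-Laplacian eigenvector algorithm is not reproduced here, but the "graph reducible" idea behind it can be sketched: reduce the EDVW hypergraph to a weighted graph and bipartition it spectrally. The sketch below uses an assumed per-hyperedge reduction weight gamma_e(u)*gamma_e(v)/gamma_e(e) and the standard 2-Laplacian (Fiedler-vector) baseline that the paper compares against.
```python
# Minimal sketch (not the paper's algorithm): spectral bipartitioning of a
# hypergraph with edge-dependent vertex weights (EDVWs) after reducing it to a
# weighted graph. The reduction weight gamma_e(u)*gamma_e(v)/gamma_e(e) is an
# assumed illustrative choice.
import numpy as np


def clique_expansion_adjacency(n_vertices, hyperedges):
    """Reduce the hypergraph to a weighted graph adjacency matrix.

    `hyperedges` is a list of dicts mapping vertex index -> gamma_e(v).
    """
    A = np.zeros((n_vertices, n_vertices))
    for gamma in hyperedges:
        total = sum(gamma.values())
        verts = list(gamma)
        for i, u in enumerate(verts):
            for v in verts[i + 1:]:
                w = gamma[u] * gamma[v] / total
                A[u, v] += w
                A[v, u] += w
    return A


def fiedler_vector(A):
    """Eigenvector of the graph 2-Laplacian for its second smallest eigenvalue."""
    L = np.diag(A.sum(axis=1)) - A
    _, vecs = np.linalg.eigh(L)  # eigenvalues are returned in ascending order
    return vecs[:, 1]


# Toy hypergraph: 6 vertices, 3 hyperedges with edge-dependent vertex weights.
hyperedges = [
    {0: 2.0, 1: 1.0, 2: 1.0},
    {2: 1.0, 3: 1.0},
    {3: 1.0, 4: 2.0, 5: 1.0},
]
A = clique_expansion_adjacency(6, hyperedges)
labels = (fiedler_vector(A) > 0).astype(int)  # two-way split by sign
print(labels)
```
Swapping the Fiedler-vector step for the second eigenvector of the hypergraph 1-Laplacian (typically followed by a sweep cut over its entries) is the change the abstract reports as yielding higher clustering accuracy.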
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework built on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Hypergraphs with Edge-Dependent Vertex Weights: Spectral Clustering based on the 1-Laplacian [24.88719567631694]
We propose a flexible framework for defining the 1-Laplacian of a hypergraph that incorporates edge-dependent weights.
We show that for a special case within our framework, the corresponding hypergraph 1-Laplacian is equivalent to the 1-Laplacian of a related graph.
arXiv Detail & Related papers (2023-04-30T12:21:42Z)
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach from images/graphs (referred to as HyperGCL) to improve the generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- Nonbacktracking spectral clustering of nonuniform hypergraphs [2.408714894793063]
We study spectral clustering for nonuniform hypergraphs based on the hypergraph nonbacktracking operator.
We propose an alternating algorithm for inference in a hypergraph blockmodel via linearized belief-propagation.
arXiv Detail & Related papers (2022-04-27T01:14:06Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- HyperSF: Spectral Hypergraph Coarsening via Flow-based Local Clustering [9.438207505148947]
We propose an efficient spectral hypergraph coarsening scheme (HyperSF) for preserving the original spectral (structural) properties of hypergraphs.
Our results show that the proposed hypergraph coarsening algorithm can significantly improve the multi-way conductance of hypergraph clustering.
arXiv Detail & Related papers (2021-08-17T22:20:23Z)
- Co-clustering Vertices and Hyperedges via Spectral Hypergraph Partitioning [18.800058655626696]
We propose a novel method to co-cluster the vertices and hyperedges of hypergraphs with edge-dependent vertex weights (EDVWs).
In our method, we leverage random walks with EDVWs to construct a hypergraph Laplacian and use its spectral properties to embed vertices and hyperedges in a common space.
We then cluster these embeddings to obtain our proposed co-clustering method, of particular relevance in applications requiring the simultaneous clustering of data entities and features.
arXiv Detail & Related papers (2021-02-19T21:47:39Z)
- Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, including as an example an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
arXiv Detail & Related papers (2021-01-24T00:25:22Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering that is consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)