Hypergraph Modeling via Spectral Embedding Connection: Hypergraph Cut,
Weighted Kernel $k$-means, and Heat Kernel
- URL: http://arxiv.org/abs/2203.09888v1
- Date: Fri, 18 Mar 2022 11:51:59 GMT
- Title: Hypergraph Modeling via Spectral Embedding Connection: Hypergraph Cut,
Weighted Kernel $k$-means, and Heat Kernel
- Authors: Shota Saito
- Abstract summary: We propose a theoretical framework of multi-way similarity to model real-valued data into hypergraphs for clustering via spectral embedding.
Our algorithm empirically shows better performance than existing graph and other modeling methods.
- Score: 1.8528929583956726
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a theoretical framework of multi-way similarity to model
real-valued data into hypergraphs for clustering via spectral embedding. For
graph-cut-based spectral clustering, it is common to model real-valued data as
a graph by expressing pairwise similarities with a kernel function, because the
kernel function has a theoretical connection to the graph cut. For problems
where multi-way similarities are more suitable than pairwise ones, it is
natural to model the data as a hypergraph, which is a generalization of a
graph. However, although the hypergraph cut is well studied, no
hypergraph-cut-based framework for modeling multi-way similarity has yet been
established. In this paper, we formulate multi-way similarities by exploiting
the theoretical foundation of the kernel function. We justify our formulation
by showing a theoretical connection to the hypergraph cut in two ways,
generalizing both weighted kernel $k$-means and the heat kernel. We also
provide a fast algorithm for spectral clustering. Our algorithm empirically
shows better performance than existing graph-based and other heuristic modeling
methods.
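The pairwise baseline the abstract contrasts against can be made concrete. The sketch below is not the paper's hypergraph method; it is a minimal NumPy illustration of the standard pipeline the abstract describes: model real-valued data as a graph with an RBF kernel, form the normalized graph Laplacian, and cluster via the spectral embedding. The toy data, kernel bandwidth, and two-cluster threshold are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy real-valued data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(0.0, 0.3, size=(20, 2)),
               rng.normal(3.0, 0.3, size=(20, 2))])

# Model pairwise similarities with an RBF kernel; W is the
# weighted adjacency matrix of the resulting graph.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq / (2 * 0.5 ** 2))
np.fill_diagonal(W, 0.0)

# Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
d = W.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(X)) - D_inv_sqrt @ W @ D_inv_sqrt

# Spectral embedding: eigenvectors of the two smallest eigenvalues.
vals, vecs = np.linalg.eigh(L)
emb = vecs[:, :2]

# For two clusters, the sign of the second eigenvector
# (the Fiedler-like direction) recovers the partition.
labels = (emb[:, 1] > 0).astype(int)
```

The paper's contribution is the multi-way analogue of this construction: replacing the pairwise kernel `W` with a kernel-derived multi-way similarity on hyperedges while keeping the theoretical link to the (hyper)graph cut.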
Related papers
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Core-periphery Models for Hypergraphs [0.0]
We introduce a random hypergraph model for core-periphery structure.
We develop a novel statistical inference algorithm that scales to large hypergraphs with practically linear runtime.
Our inference algorithm is capable of learning embeddings that correspond to the reputation (rank) of a node within the hypergraph.
arXiv Detail & Related papers (2022-06-01T22:11:44Z) - Nonbacktracking spectral clustering of nonuniform hypergraphs [2.408714894793063]
We study spectral clustering for nonuniform hypergraphs based on the hypergraph nonbacktracking operator.
We propose an alternating algorithm for inference in a hypergraph blockmodel via linearized belief-propagation.
arXiv Detail & Related papers (2022-04-27T01:14:06Z) - Effects of Graph Convolutions in Deep Networks [8.937905773981702]
We present a rigorous theoretical understanding of the effects of graph convolutions in multi-layer networks.
We show that a single graph convolution expands the regime of the distance between the means where multi-layer networks can classify the data.
We provide both theoretical and empirical insights into the performance of graph convolutions placed in different combinations among the layers of a network.
arXiv Detail & Related papers (2022-04-20T08:24:43Z) - Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
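The phrase "kernel functions that compute an inner product on graphs" can be grounded with a minimal example. This is not the architecture from the paper above, only an illustrative vertex-label histogram kernel: each graph is mapped to a histogram of its node labels, and the kernel is the inner product of those feature vectors (hence positive semi-definite by construction). The function name and signature are assumptions for illustration.

```python
import numpy as np

def vertex_histogram_kernel(labels_g1, labels_g2, n_labels):
    """Minimal graph kernel: inner product of node-label histograms.

    labels_g1, labels_g2: integer node labels of two graphs.
    n_labels: size of the shared label alphabet.
    """
    h1 = np.bincount(labels_g1, minlength=n_labels)
    h2 = np.bincount(labels_g2, minlength=n_labels)
    return float(h1 @ h2)

# Two graphs with node labels over the alphabet {0, 1}:
# histograms [2, 1] and [1, 2], so the kernel value is 2*1 + 1*2 = 4.
k = vertex_histogram_kernel(np.array([0, 0, 1]), np.array([0, 1, 1]), 2)
```

Because the kernel comes from an explicit feature map, it can serve as a drop-in similarity wherever an inner product on graphs is required, which is the property the paper exploits to generalize convolution to the graph domain.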
arXiv Detail & Related papers (2021-12-14T14:48:08Z) - Non-separable Spatio-temporal Graph Kernels via SPDEs [90.62347738138594]
We provide graph kernels for principled spatio-temporal modelling on graphs.
By providing novel tools for modelling on graphs, we outperform pre-existing graph kernels in real-world applications.
arXiv Detail & Related papers (2021-11-16T14:53:19Z) - Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, including as an example an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
arXiv Detail & Related papers (2021-01-24T00:25:22Z) - Transport based Graph Kernels [30.541423115387097]
Graph kernel is a powerful tool measuring the similarity between graphs.
Most of the existing graph kernels focused on node labels or attributes and ignored graph hierarchical structure information.
We propose a pyramid graph kernel based on optimal transport (OT).
We evaluate the proposed graph kernels on several benchmark classification tasks and compare their performance with the existing state-of-the-art graph kernels.
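The optimal-transport ingredient of the kernel above has a simple closed form in one dimension, which may help ground the idea. The sketch below is not the pyramid kernel from the paper; it is only the 1-D earth mover's distance between two histograms on a common unit-spaced grid, computed as the L1 distance between their cumulative distributions. The function name is an assumption for illustration.

```python
import numpy as np

def wasserstein_1d(p, q):
    """1-D optimal transport (earth mover's) distance between two
    histograms on the same unit-spaced grid: the L1 distance of
    their normalized CDFs."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

# Moving all mass from bin 0 to bin 2 costs distance 2.
d = wasserstein_1d([1, 0, 0], [0, 0, 1])
```

OT-based graph kernels extend this idea to distributions over graph substructures, where the transport plan respects structural (e.g. hierarchical) information rather than a 1-D grid.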
arXiv Detail & Related papers (2020-11-02T04:44:27Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.