Information Limits for Detecting a Subhypergraph
- URL: http://arxiv.org/abs/2105.02259v1
- Date: Wed, 5 May 2021 18:08:42 GMT
- Title: Information Limits for Detecting a Subhypergraph
- Authors: Mingao Yuan, Zuofeng Shang
- Abstract summary: We consider the problem of recovering a subhypergraph based on an observed adjacency tensor corresponding to a uniform hypergraph.
We consider both weak recovery and exact recovery of the subhypergraph, and establish information-theoretic limits in each case.
- Score: 6.903929927172917
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of recovering a subhypergraph based on an observed
adjacency tensor corresponding to a uniform hypergraph. The uniform hypergraph
is assumed to contain a subset of vertices, called the subhypergraph. The edges
restricted to the subhypergraph are assumed to follow a different probability
distribution than other edges. We consider both weak recovery and exact
recovery of the subhypergraph, and establish information-theoretic limits in
each case. Specifically, we establish sharp conditions for the possibility of
weakly or exactly recovering the subhypergraph from an information-theoretic
point of view. These conditions are fundamentally different from their
counterparts derived in hypothesis testing literature.
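The planted model described in the abstract can be simulated in a few lines. The sketch below uses a 3-uniform hypergraph; the sizes n and m, the probabilities p and q, and the naive degree-based estimator are illustrative assumptions, not the paper's values or its information-theoretically optimal procedure.

```python
import itertools
import random

random.seed(0)

n, m = 12, 5     # total vertices, planted subhypergraph size (illustrative)
p, q = 0.9, 0.1  # edge probability inside the planted set vs. elsewhere (illustrative)
planted = set(range(m))

# 3-uniform hypergraph: each edge is a vertex triple; the adjacency "tensor"
# is stored sparsely as a dict over sorted triples.
A = {}
for e in itertools.combinations(range(n), 3):
    prob = p if set(e) <= planted else q
    A[e] = 1 if random.random() < prob else 0

# Naive estimator: rank vertices by degree (number of incident edges) and
# take the top m; this only illustrates the recovery task, not the paper's test.
deg = [0] * n
for e, x in A.items():
    if x:
        for v in e:
            deg[v] += 1
S_hat = set(sorted(range(n), key=lambda v: -deg[v])[:m])
overlap = len(S_hat & planted) / m
print(f"fraction of planted vertices recovered: {overlap:.2f}")
```

Weak recovery asks only that the estimated set overlap the planted set substantially better than chance, while exact recovery requires the estimate to equal the planted set; the paper's thresholds separate the regimes of p, q, m, and n where each is possible.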
Related papers
- Towards Self-Interpretable Graph-Level Anomaly Detection [73.1152604947837]
Graph-level anomaly detection (GLAD) aims to identify graphs that exhibit notable dissimilarity compared to the majority in a collection.
We propose a Self-Interpretable Graph aNomaly dETection model (SIGNET) that detects anomalous graphs and simultaneously generates informative explanations.
arXiv Detail & Related papers (2023-10-25T10:10:07Z)
- Hypergraph Structure Inference From Data Under Smoothness Prior [46.568839316694515]
We propose a method to infer the probability for each potential hyperedge without labelled data as supervision.
We use this prior to derive the relation between the hypergraph structure and the node features via probabilistic modelling.
Experiments on both synthetic and real-world data demonstrate that our method can learn meaningful hypergraph structures from data more efficiently than existing hypergraph structure inference methods.
arXiv Detail & Related papers (2023-08-27T18:28:58Z)
- Optimal and exact recovery on general non-uniform Hypergraph Stochastic Block Model [0.0]
We consider the community detection problem in random hypergraphs under the non-uniform hypergraph stochastic block model (HSBM).
We establish, for the first time in the literature, a sharp threshold for exact recovery under this non-uniform case, subject to minor constraints.
We provide two efficient algorithms that successfully achieve exact recovery above the threshold, and attain the lowest possible ratio when exact recovery is impossible.
arXiv Detail & Related papers (2023-04-25T20:30:33Z)
- Resolving label uncertainty with implicit posterior models [71.62113762278963]
We propose a method for jointly inferring labels across a collection of data samples.
By implicitly assuming the existence of a generative model for which a differentiable predictor is the posterior, we derive a training objective that allows learning under weak beliefs.
arXiv Detail & Related papers (2022-02-28T18:09:44Z)
- Exact Recovery in the General Hypergraph Stochastic Block Model [92.28929858529679]
This paper investigates fundamental limits of exact recovery in the general d-uniform hypergraph stochastic block model (d-HSBM).
We show that there exists a sharp threshold such that exact recovery is achievable above the threshold and impossible below it.
arXiv Detail & Related papers (2021-05-11T03:39:08Z)
- Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
arXiv Detail & Related papers (2021-05-03T16:36:27Z)
- Heterogeneous Dense Subhypergraph Detection [6.903929927172917]
We study the problem of testing the existence of a heterogeneous dense subhypergraph.
The null hypothesis corresponds to a heterogeneous Erdős-Rényi uniform random hypergraph.
The alternative hypothesis corresponds to a heterogeneous uniform random hypergraph that contains a dense subhypergraph.
arXiv Detail & Related papers (2021-04-08T20:44:22Z)
- Recognizing Predictive Substructures with Subgraph Information Bottleneck [97.19131149357234]
We propose a novel subgraph information bottleneck (SIB) framework to recognize such subgraphs, named IB-subgraph.
The intractability of mutual information and the discrete nature of graph data make the objective of SIB notoriously hard to optimize.
Experiments on graph learning and large-scale point cloud tasks demonstrate the superior properties of IB-subgraph.
arXiv Detail & Related papers (2021-03-20T11:19:43Z)
- Sharp detection boundaries on testing dense subhypergraph [6.903929927172917]
We study the problem of testing the existence of a dense subhypergraph.
The null hypothesis is an Erdős-Rényi uniform random hypergraph.
The alternative hypothesis is a uniform random hypergraph that contains a dense subhypergraph.
arXiv Detail & Related papers (2021-01-12T16:31:47Z)
- Hypergraph Partitioning using Tensor Eigenvalue Decomposition [19.01626581411011]
We propose a novel approach for the partitioning of k-uniform hypergraphs.
Most of the existing methods work by reducing the hypergraph to a graph followed by applying standard graph partitioning algorithms.
We overcome this issue by utilizing the tensor-based representation of hypergraphs.
arXiv Detail & Related papers (2020-11-16T01:55:43Z)
- Semi-supervised Hypergraph Node Classification on Hypergraph Line Expansion [7.933465724913661]
We propose a new hypergraph formulation named the line expansion (LE) for hypergraph learning.
The proposed line expansion makes existing graph learning algorithms compatible with the higher-order structure.
We evaluate the proposed line expansion on five hypergraph datasets; the results show that our method beats SOTA baselines by a significant margin.
arXiv Detail & Related papers (2020-05-11T03:02:21Z)
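As a toy illustration of the line expansion summarized above: each (vertex, hyperedge) incidence pair becomes an LE node, and two LE nodes are connected when they share either the vertex or the hyperedge. This adjacency rule is an assumption inferred from the abstract and may differ in detail from the paper's construction.

```python
import itertools

# Toy 3-uniform hypergraph: vertices 0..4, three hyperedges.
hyperedges = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]

# Line expansion: one LE node per (vertex, hyperedge) incidence pair.
le_nodes = [(v, i) for i, e in enumerate(hyperedges) for v in e]

# Two LE nodes are adjacent when they share the vertex or the hyperedge
# (assumed rule); frozensets deduplicate undirected pairs.
le_edges = {
    frozenset({a, b})
    for a, b in itertools.combinations(le_nodes, 2)
    if a[0] == b[0] or a[1] == b[1]
}

print(len(le_nodes), len(le_edges))  # prints: 9 14
```

The resulting LE graph is an ordinary graph, which is why standard graph learning algorithms become applicable to the higher-order structure.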
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.