Analysis of Semi-Supervised Learning on Hypergraphs
- URL: http://arxiv.org/abs/2510.25354v1
- Date: Wed, 29 Oct 2025 10:19:32 GMT
- Title: Analysis of Semi-Supervised Learning on Hypergraphs
- Authors: Adrien Weihs, Andrea Bertozzi, Matthew Thorpe
- Abstract summary: We propose Higher-Order Hypergraph Learning (HOHL), which regularizes via powers of Laplacians from skeleton graphs for multiscale smoothness. HOHL converges to a higher-order Sobolev seminorm. Empirically, it performs strongly on standard baselines.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hypergraphs provide a natural framework for modeling higher-order interactions, yet their theoretical underpinnings in semi-supervised learning remain limited. We provide an asymptotic consistency analysis of variational learning on random geometric hypergraphs, precisely characterizing the conditions ensuring the well-posedness of hypergraph learning as well as showing convergence to a weighted $p$-Laplacian equation. Motivated by this, we propose Higher-Order Hypergraph Learning (HOHL), which regularizes via powers of Laplacians from skeleton graphs for multiscale smoothness. HOHL converges to a higher-order Sobolev seminorm. Empirically, it performs strongly on standard baselines.
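The regularization idea behind HOHL can be sketched in a few lines: form the Laplacian of a skeleton graph (here a simple clique expansion, one of several possible skeleton choices) and penalize u^T L^s u, a discrete analogue of a higher-order Sobolev seminorm. The construction below is a hedged illustration of that idea, not the paper's exact definition.

```python
import numpy as np

def skeleton_laplacian(H):
    """Unnormalized Laplacian of a clique-expansion (skeleton) graph of a
    hypergraph with node-by-hyperedge incidence matrix H."""
    A = H @ H.T             # co-membership counts between node pairs
    np.fill_diagonal(A, 0)  # drop self-loops
    return np.diag(A.sum(axis=1)) - A

def hohl_regularizer(u, H, power=2):
    """Higher-order smoothness penalty u^T L^s u on a label/function vector u.
    Illustrative form only; HOHL combines several scales."""
    L = skeleton_laplacian(H)
    return float(u @ np.linalg.matrix_power(L, power) @ u)
```

As expected of a seminorm-like penalty, constant vectors incur zero cost, while vectors that vary across co-members of a hyperedge are penalized.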
Related papers
- BHyGNN+: Unsupervised Representation Learning for Heterophilic Hypergraphs
We introduce BHyGNN+, a self-supervised learning framework for representation learning on heterophilic hypergraphs. By contrasting augmented views of a hypergraph against its dual using cosine similarity, our framework captures essential structural patterns in a fully unsupervised manner. Our results validate the effectiveness of leveraging hypergraph duality for self-supervised learning.
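The contrastive objective described above can be sketched with a standard InfoNCE-style loss over cosine similarities, where row i of one view is the positive for row i of the other (e.g. a hypergraph view versus its dual). This is a generic sketch of cosine-similarity contrastive learning, not BHyGNN+'s exact loss.

```python
import numpy as np

def cosine_contrast_loss(Z1, Z2, tau=0.5):
    """InfoNCE-style loss contrasting row-aligned embeddings of two views;
    matching rows are positives, all other rows are negatives."""
    Z1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
    Z2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
    sim = Z1 @ Z2.T / tau                           # pairwise cosine similarities
    logits = sim - sim.max(axis=1, keepdims=True)   # stabilize the softmax
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))
```

Perfectly aligned views yield a lower loss than views whose rows are shuffled out of correspondence.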
arXiv Detail & Related papers (2026-02-16T16:55:37Z) - Higher-Order Regularization Learning on Hypergraphs
Higher-Order Hypergraph Learning (HOHL) was recently introduced as a principled alternative to classical hypergraph regularization. We prove the consistency of a truncated version of HOHL and derive explicit convergence rates when HOHL is used as a regularizer in fully supervised learning.
arXiv Detail & Related papers (2025-10-30T14:22:57Z) - Lower Ricci Curvature for Hypergraphs [3.9965784551765697]
We introduce hypergraph lower Ricci curvature (HLRC), a novel curvature metric defined in closed form that achieves a principled balance between interpretability and efficiency. HLRC consistently reveals meaningful higher-order organization: it distinguishes intra-community hyperedges, uncovers latent semantic labels, tracks temporal dynamics, and supports robust clustering of hypergraphs based on global structure.
arXiv Detail & Related papers (2025-06-04T13:32:09Z) - Hypergraph Neural Sheaf Diffusion: A Symmetric Simplicial Set Framework for Higher-Order Learning
We introduce Hypergraph Neural Sheaf Diffusion (HNSD), the first principled extension of neural sheaf diffusion to hypergraphs. HNSD operates via a normalized degree-zero sheaf Laplacian over a symmetric simplicial lifting, resolving the orientation ambiguity and adjacency sparsity inherent to hypergraph learning.
arXiv Detail & Related papers (2025-05-09T00:26:38Z) - HyperGCT: A Dynamic Hyper-GNN-Learned Geometric Constraint for 3D Registration
HyperGCT is a flexible, dynamic geometric constraint learned by a hyper-GNN. It mines robust geometric constraints from dynamic hypergraphs for 3D registration. Experiments on 3DMatch, 3DLoMatch, KITTI-LC, and ETH show that HyperGCT achieves state-of-the-art performance.
arXiv Detail & Related papers (2025-03-04T02:05:43Z) - Hypergraphs as Weighted Directed Self-Looped Graphs: Spectral Properties, Clustering, Cheeger Inequality
Hypergraphs arise when studying group relations and have been widely used in the field of machine learning.
There has not yet been a unified formulation of hypergraphs; the recently proposed edge-dependent vertex weights (EDVW) modeling is one of the most general ways to model hypergraphs.
We propose our definitions of hypergraph Quotient, NCut, boundary/cut, volume, and conductance, which are consistent with the corresponding definitions on graphs.
Then, we prove that the normalized hypergraph Laplacian is associated with the NCut value, which inspires our HyperClus-G algorithm for spectral clustering.
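For intuition, the classical normalized hypergraph Laplacian (in the style of Zhou et al.) is L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}, and spectral clustering embeds nodes with the eigenvectors of its smallest eigenvalues. The sketch below uses that classical construction as an illustration; the paper's EDVW-based Laplacian generalizes it.

```python
import numpy as np

def normalized_hypergraph_laplacian(H, w=None):
    """L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} for incidence matrix H
    (nodes x hyperedges) and hyperedge weights w (defaults to ones)."""
    n, m = H.shape
    w = np.ones(m) if w is None else w
    dv = H @ w                    # weighted node degrees
    de = H.sum(axis=0)            # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv_inv_sqrt @ H @ np.diag(w / de) @ H.T @ Dv_inv_sqrt
    return np.eye(n) - Theta

def spectral_embedding(H, k=2):
    """Node embedding from the k eigenvectors with smallest eigenvalues;
    k-means on these rows gives a spectral clustering."""
    L = normalized_hypergraph_laplacian(H)
    _, vecs = np.linalg.eigh(L)   # eigh returns eigenvalues in ascending order
    return vecs[:, :k]
```

On a hypergraph with two disconnected hyperedges, the eigenvalue 0 has multiplicity two, mirroring the two connected components.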
arXiv Detail & Related papers (2024-10-23T05:16:48Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
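The "embeddings as energy minimizers" view can be illustrated with a toy energy E(Y) = ||Y - X||_F^2 + lam * tr(Y^T L Y), minimized by gradient descent, with L a simple clique-expansion Laplacian. This is a minimal sketch of the idea, not the paper's parameterized energy family.

```python
import numpy as np

def embed_by_energy_minimization(X, H, lam=1.0, lr=0.1, steps=200):
    """Return node embeddings Y minimizing a hypergraph-regularized energy:
    a data-fidelity term plus a Laplacian smoothness term (toy sketch)."""
    A = H @ H.T
    np.fill_diagonal(A, 0)
    L = np.diag(A.sum(axis=1)) - A      # clique-expansion Laplacian
    Y = X.copy()
    for _ in range(steps):
        grad = 2 * (Y - X) + 2 * lam * (L @ Y)  # gradient of E(Y)
        Y -= lr * grad
    return Y
```

Since the energy is strongly convex, the iterates strictly decrease it, so the returned embeddings trade fidelity to X for smoothness across hyperedges.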
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN). THNN is a faithful hypergraph modeling framework built on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
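The outer-product message-passing idea can be sketched for a 3-uniform hypergraph: contract the adjacency tensor with node features along two modes, so each node aggregates outer products of its co-members' features. This sketch only illustrates the tensor contraction; THNN additionally uses learnable weights and nonlinearities.

```python
import numpy as np

def thnn_style_message(T, X):
    """One high-order message step on a 3-uniform hypergraph:
    m_i[a, b] = sum_{j,k} T[i, j, k] * X[j, a] * X[k, b],
    then flatten the outer-product messages per node."""
    M = np.einsum('ijk,ja,kb->iab', T, X, X)
    n, d, _ = M.shape
    return M.reshape(n, d * d)
```

With one hyperedge {0, 1, 2} and one-hot features, node 0's message collects exactly the two ordered pairs (1, 2) and (2, 1) of its co-members.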
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs
and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs. Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Nonparametric Modeling of Higher-Order Interactions via Hypergraphons [11.6503817521043]
We study statistical and algorithmic aspects of using hypergraphons, the limits of large hypergraphs, for modeling higher-order interactions. We consider a restricted class of Simple Lipschitz Hypergraphons (SLH) that are amenable to practically efficient estimation.
arXiv Detail & Related papers (2021-05-18T17:08:29Z) - Spatial-spectral Hyperspectral Image Classification via Multiple Random Anchor Graphs Ensemble Learning
This paper proposes a novel spatial-spectral HSI classification method via multiple random anchor graphs ensemble learning (RAGE). Firstly, the local binary pattern is adopted to extract more descriptive features on each selected band, preserving local structures and subtle changes within a region. Secondly, adaptive neighbor assignment is introduced in the construction of the anchor graph to reduce computational complexity.
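Anchor-graph construction with adaptive neighbors can be sketched as follows: each sample connects only to its s nearest anchors, with weights from a closed-form adaptive-neighbor rule (the one popularized by Nie et al.) so that each row of the affinity matrix sums to one. This is a generic illustration; RAGE's exact assignment may differ.

```python
import numpy as np

def anchor_graph(X, anchors, s=3):
    """Sparse sample-to-anchor affinity matrix Z. Each row keeps only the
    s nearest anchors, weighted by z_ij = (d_{i,s+1} - d_ij) / (s*d_{i,s+1}
    - sum of the s nearest distances), so rows sum to 1."""
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)  # squared distances
    Z = np.zeros_like(d2)
    for i, row in enumerate(d2):
        idx = np.argsort(row)[:s + 1]          # s nearest anchors plus one more
        d_s1, d_top = row[idx[s]], row[idx[:s]]
        w = (d_s1 - d_top) / max(s * d_s1 - d_top.sum(), 1e-12)
        Z[i, idx[:s]] = w
    return Z
```

Keeping only s nonzeros per row is what makes downstream eigen-computations on the anchor graph cheap relative to a full sample-to-sample graph.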
arXiv Detail & Related papers (2021-03-25T09:31:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.