Higher-Order Regularization Learning on Hypergraphs
- URL: http://arxiv.org/abs/2510.26533v1
- Date: Thu, 30 Oct 2025 14:22:57 GMT
- Title: Higher-Order Regularization Learning on Hypergraphs
- Authors: Adrien Weihs, Andrea Bertozzi, Matthew Thorpe
- Abstract summary: Higher-Order Hypergraph Learning (HOHL) was recently introduced as a principled alternative to classical hypergraph regularization. We prove the consistency of a truncated version of HOHL and derive explicit convergence rates when HOHL is used as a regularizer in fully supervised learning.
- Score: 1.8297494098768168
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Higher-Order Hypergraph Learning (HOHL) was recently introduced as a principled alternative to classical hypergraph regularization, enforcing higher-order smoothness via powers of multiscale Laplacians induced by the hypergraph structure. Prior work established the well- and ill-posedness of HOHL through an asymptotic consistency analysis in geometric settings. We extend this theoretical foundation by proving the consistency of a truncated version of HOHL and deriving explicit convergence rates when HOHL is used as a regularizer in fully supervised learning. We further demonstrate its strong empirical performance in active learning and in datasets lacking an underlying geometric structure, highlighting HOHL's versatility and robustness across diverse learning settings.
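The regularizer described in the abstract, higher-order smoothness enforced via powers of Laplacians induced by the hypergraph structure, can be illustrated in a few lines. This is a minimal sketch, not the authors' implementation: the clique-expansion (skeleton) construction, the toy hypergraph, and the choice of powers and weights are all illustrative assumptions.

```python
import numpy as np

def skeleton_laplacian(n, hyperedges):
    """Unnormalized Laplacian of the clique-expansion (skeleton) graph:
    every pair of nodes sharing a hyperedge is connected."""
    A = np.zeros((n, n))
    for e in hyperedges:
        for i in e:
            for j in e:
                if i != j:
                    A[i, j] = 1.0
    return np.diag(A.sum(axis=1)) - A

def hohl_penalty(u, laplacians, powers, weights):
    """Multiscale higher-order penalty: sum_k w_k * u^T L_k^{s_k} u."""
    return sum(w * u @ np.linalg.matrix_power(L, s) @ u
               for L, s, w in zip(laplacians, powers, weights))

# toy hypergraph on 4 nodes with two hyperedges
L1 = skeleton_laplacian(4, [(0, 1, 2), (2, 3)])
u = np.ones(4)  # a constant signal lies in the Laplacian's null space
print(hohl_penalty(u, [L1], powers=[2], weights=[1.0]))  # 0.0
```

Since constants are in the null space of each skeleton Laplacian, the penalty vanishes on them and grows with the chosen power `s` for rougher signals, which is the multiscale smoothness behavior the paper analyzes.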
Related papers
- Hyper-KGGen: A Skill-Driven Knowledge Extractor for High-Quality Knowledge Hypergraph Generation [63.4604143884703]
Hyper-KGGen is a skill-driven framework that reformulates extraction as a skill-evolving process. It incorporates an adaptive skill acquisition module that actively distills domain expertise into a Global Skill Library. We present HyperDocRED, a rigorously annotated benchmark for document-level knowledge hypergraph extraction.
arXiv Detail & Related papers (2026-02-23T06:32:00Z) - Analysis of Semi-Supervised Learning on Hypergraphs [1.8297494098768168]
We propose Higher-Order Hypergraph Learning (HOHL), which regularizes via powers of Laplacians from skeleton graphs for multiscale smoothness. HOHL converges to a higher-order Sobolev seminorm. Empirically, it performs strongly on standard baselines.
arXiv Detail & Related papers (2025-10-29T10:19:32Z) - Implicit Hypergraph Neural Networks: A Stable Framework for Higher-Order Relational Learning with Provable Guarantees [8.5183483099116]
We introduce Implicit Hypergraph Neural Networks (IHGNN), which compute representations as the solution to a nonlinear fixed-point equation. IHGNN consistently outperforms strong traditional graph/hypergraph neural network baselines in both accuracy and robustness.
arXiv Detail & Related papers (2025-08-13T02:06:29Z) - Lower Ricci Curvature for Hypergraphs [3.9965784551765697]
We introduce hypergraph lower Ricci curvature (HLRC), a novel curvature metric defined in closed form that achieves a principled balance between interpretability and efficiency. HLRC consistently reveals meaningful higher-order organization: distinguishing intra-community hyperedges, uncovering latent semantic labels, tracking temporal dynamics, and supporting robust clustering of hypergraphs based on global structure.
arXiv Detail & Related papers (2025-06-04T13:32:09Z) - Relational Learning in Pre-Trained Models: A Theory from Hypergraph Recovery Perspective [60.64922606733441]
We introduce a mathematical model that formalizes relational learning as hypergraph recovery to study the pre-training of Foundation Models (FMs).
In our framework, the world is represented as a hypergraph, with data abstracted as random samples from hyperedges. We theoretically examine the feasibility of a Pre-Trained Model (PTM) to recover this hypergraph and analyze the data efficiency in a minimax near-optimal style.
arXiv Detail & Related papers (2024-06-17T06:20:39Z) - Learning from Heterogeneity: A Dynamic Learning Framework for Hypergraphs [22.64740740462169]
We propose a hypergraph learning framework named LFH that is capable of dynamic hyperedge construction and attentive embedding update. To evaluate the effectiveness of our proposed framework, we conduct comprehensive experiments on several popular datasets.
arXiv Detail & Related papers (2023-07-07T06:26:44Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs
and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [61.49208407567829]
This survey paper provides a comprehensive review of the rapidly evolving field of Hyperbolic Graph Learning (HGL). We systematically categorize and analyze existing methods, dividing them into (1) hyperbolic graph embedding-based techniques, (2) graph neural network-based hyperbolic models, and (3) emerging paradigms. We extensively discuss diverse applications of HGL across multiple domains, including recommender systems, knowledge graphs, bioinformatics, and other relevant scenarios.
arXiv Detail & Related papers (2022-02-28T15:08:48Z) - Nonparametric Modeling of Higher-Order Interactions via Hypergraphons [11.6503817521043]
We study statistical and algorithmic aspects of using hypergraphons, which are limits of large hypergraphs, for modeling higher-order interactions.
We consider a restricted class of Simple Lipschitz Hypergraphons (SLH), which are amenable to practically efficient estimation.
arXiv Detail & Related papers (2021-05-18T17:08:29Z) - Spatial-spectral Hyperspectral Image Classification via Multiple Random
Anchor Graphs Ensemble Learning [88.60285937702304]
This paper proposes a novel spatial-spectral HSI classification method via multiple random anchor graphs ensemble learning (RAGE).
Firstly, the local binary pattern is adopted to extract more descriptive features on each selected band, preserving local structures and subtle changes of a region.
Secondly, an adaptive neighbor assignment is introduced in the construction of the anchor graphs to reduce computational complexity.
arXiv Detail & Related papers (2021-03-25T09:31:41Z) - HNHN: Hypergraph Networks with Hyperedge Neurons [90.15253035487314]
HNHN is a hypergraph convolution network with nonlinear activation functions applied to both hypernodes and hyperedges.
We demonstrate improved performance of HNHN in both classification accuracy and speed on real-world datasets when compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-06-22T14:08:32Z)
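The HNHN summary above describes a hypergraph convolution that applies nonlinear activations to both hypernodes and hyperedges. A hedged, minimal sketch of that pattern follows; the incidence matrix, ReLU choice, and random weights are illustrative assumptions, and the paper's degree normalization is omitted for brevity.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def hnhn_layer(X, H, We, Wv):
    """One HNHN-style layer: nodes are aggregated into hyperedge features
    with a nonlinearity, then scattered back to nodes with another."""
    E = relu(H.T @ X @ We)   # hyperedge features from incident nodes
    return relu(H @ E @ Wv)  # updated node features from incident hyperedges

# toy example: 4 nodes, 2 hyperedges, 3-dim features
rng = np.random.default_rng(0)
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)  # incidence matrix
X = rng.standard_normal((4, 3))
out = hnhn_layer(X, H, rng.standard_normal((3, 3)), rng.standard_normal((3, 3)))
print(out.shape)  # (4, 3)
```

Applying the nonlinearity at the hyperedge stage, rather than only at nodes, is what distinguishes this scheme from a plain clique-expansion graph convolution.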
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.