SoftHGNN: Soft Hypergraph Neural Networks for General Visual Recognition
- URL: http://arxiv.org/abs/2505.15325v1
- Date: Wed, 21 May 2025 10:01:30 GMT
- Title: SoftHGNN: Soft Hypergraph Neural Networks for General Visual Recognition
- Authors: Mengqi Lei, Yihong Wu, Siqi Li, Xinhu Zheng, Juan Wang, Yue Gao, Shaoyi Du
- Abstract summary: Hypergraphs extend conventional graphs by modeling high-order interactions. Existing hypergraph neural networks typically rely on static and hard hyperedge assignments. We present Soft Hypergraph Neural Networks (SoftHGNNs), which extend the methodology of hypergraph computation.
- Score: 19.88933136559363
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Visual recognition relies on understanding both the semantics of image tokens and the complex interactions among them. Mainstream self-attention methods, while effective at modeling global pair-wise relations, fail to capture the high-order associations inherent in real-world scenes and often suffer from redundant computation. Hypergraphs extend conventional graphs by modeling high-order interactions and offer a promising framework for addressing these limitations. However, existing hypergraph neural networks typically rely on static, hard hyperedge assignments, leading to excessive and redundant hyperedges with binary vertex memberships that overlook the continuity of visual semantics. To overcome these issues, we present Soft Hypergraph Neural Networks (SoftHGNNs), which extend hypergraph computation so that it becomes truly efficient and versatile in visual recognition tasks. Our framework introduces the concept of soft hyperedges, where each vertex is associated with hyperedges via continuous participation weights rather than hard binary assignments. This dynamic and differentiable association is achieved through learnable hyperedge prototypes: by measuring the similarity between token features and these prototypes, the model generates semantically rich soft hyperedges. SoftHGNN then aggregates messages over the soft hyperedges to capture high-order semantics. To further enhance efficiency when scaling up the number of soft hyperedges, we incorporate a sparse hyperedge selection mechanism that activates only the top-k important hyperedges, along with a load-balancing regularizer that ensures balanced hyperedge utilization. Experimental results across three tasks on five datasets demonstrate that SoftHGNN efficiently captures high-order associations in visual scenes, achieving significant performance improvements.
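To ground the mechanism the abstract describes, here is a minimal PyTorch sketch of one soft-hyperedge layer. The class and parameter names (`SoftHyperedgeLayer`, `num_edges`, `top_k`) are hypothetical, and the softmax scoring, weighted-mean aggregation, residual connection, and variance-based balance penalty are plausible stand-ins for the paper's formulation, not its published implementation:

```python
import torch
from torch import nn


class SoftHyperedgeLayer(nn.Module):
    """Minimal sketch of soft-hyperedge message passing.

    Each of the `num_edges` learnable prototypes defines one soft hyperedge;
    every token participates in every hyperedge with a continuous weight
    derived from its similarity to the prototypes.
    """

    def __init__(self, dim: int, num_edges: int, top_k: int):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(num_edges, dim) * dim**-0.5)
        self.top_k = top_k

    def forward(self, x: torch.Tensor):
        # x: (batch, tokens, dim)
        # Continuous participation weights: softmax over hyperedges, so each
        # token spreads its membership instead of a hard binary assignment.
        A = (x @ self.prototypes.t()).softmax(dim=-1)          # (B, N, E)

        # Sparse selection: keep only the top-k hyperedges per sample,
        # ranked by total participation mass, and mask out the rest.
        mass = A.sum(dim=1)                                    # (B, E)
        idx = mass.topk(self.top_k, dim=-1).indices            # (B, k)
        mask = torch.zeros_like(mass).scatter_(-1, idx, 1.0)
        A = A * mask.unsqueeze(1)

        # Message aggregation: vertices -> hyperedges (weighted mean),
        # then hyperedges -> vertices (weighted broadcast).
        edge_feat = A.transpose(1, 2) @ x                      # (B, E, D)
        edge_feat = edge_feat / (mass * mask).unsqueeze(-1).clamp(min=1e-6)
        out = A @ edge_feat                                    # (B, N, D)

        # Load-balancing penalty: a simple variance term that discourages
        # a few hyperedges from absorbing all the participation mass.
        usage = mass / mass.sum(dim=-1, keepdim=True)
        balance_loss = usage.var(dim=-1).mean()

        return out + x, balance_loss                           # residual add


# Example: 196 ViT-style patch tokens of width 256.
layer = SoftHyperedgeLayer(dim=256, num_edges=16, top_k=4)
out, aux_loss = layer(torch.randn(2, 196, 256))                # out: (2, 196, 256)
```

In training, `aux_loss` would presumably be added to the task loss with a small coefficient so that the top-k selection does not collapse onto a few prototypes.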
Related papers
- HyperGCT: A Dynamic Hyper-GNN-Learned Geometric Constraint for 3D Registration [60.01977041900338]
HyperGCT is a flexible dynamic Hyper-GNN-learned geometric ConstrainT. It mines robust geometric constraints from dynamic hypergraphs for 3D registration. Experiments on 3DMatch, 3DLoMatch, KITTI-LC, and ETH show that HyperGCT achieves state-of-the-art performance.
arXiv Detail & Related papers (2025-03-04T02:05:43Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks [9.987252149421982]
We present a novel self-supervised pretraining framework for heterogeneous HyperGNNs.
Our method is able to effectively capture higher-order relations among entities in the data in a self-supervised manner.
Our experiments show that our proposed framework consistently outperforms state-of-the-art baselines in various downstream tasks.
arXiv Detail & Related papers (2023-11-19T16:34:56Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Equivariant Hypergraph Neural Networks [15.096429849049953]
Recent approaches for hypergraph learning extend graph neural networks based on message passing, which is simple yet fundamentally limited in expressive power and in its ability to model long-range dependencies.
We present Equivariant Hypergraph Neural Network (EHNN), the first attempt to realize maximally expressive equivariant layers for general hypergraph learning.
We demonstrate their capability in a range of hypergraph learning problems, including synthetic k-edge identification, semi-supervised classification, and visual keypoint matching, and report improved performances over strong message passing baselines.
arXiv Detail & Related papers (2022-08-22T16:28:38Z)
- Principled inference of hyperedges and overlapping communities in hypergraphs [0.0]
We propose a framework based on statistical inference to characterize the structural organization of hypergraphs.
We show strong performance in hyperedge prediction tasks, detection of communities well aligned with the information carried by interactions, and robustness against the addition of noisy hyperedges.
arXiv Detail & Related papers (2022-04-12T09:13:46Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW (edge-dependent vertex weight) and EIVW (edge-independent vertex weight) hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- Learning over Families of Sets -- Hypergraph Representation Learning for Higher Order Tasks [12.28143554382742]
We develop a hypergraph neural network to learn provably expressive representations of variable-sized hyperedges.
We evaluate performance on multiple real-world hypergraph datasets and demonstrate consistent, significant improvements in accuracy over state-of-the-art models.
arXiv Detail & Related papers (2021-01-19T18:37:50Z)
- HNHN: Hypergraph Networks with Hyperedge Neurons [90.15253035487314]
HNHN is a hypergraph convolution network with nonlinear activation functions applied to both hypernodes and hyperedges.
We demonstrate improved performance of HNHN in both classification accuracy and speed on real-world datasets compared to state-of-the-art methods. (A minimal sketch of this two-sided convolution follows this entry.)
arXiv Detail & Related papers (2020-06-22T14:08:32Z)
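As an illustration of the two-sided convolution summarized above, here is a minimal dense PyTorch sketch in the same style as the earlier example. `HNHNConv` is a hypothetical name, and the plain mean normalization stands in for HNHN's tunable degree-power normalization and sparse incidence operations:

```python
import torch
from torch import nn


class HNHNConv(nn.Module):
    """Sketch of an HNHN-style layer: a nonlinearity is applied on both the
    hyperedge side and the hypernode side of the convolution. `H` is a dense
    (num_nodes x num_edges) incidence matrix, kept dense here for clarity.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.to_edge = nn.Linear(dim, dim)   # vertex -> hyperedge transform
        self.to_node = nn.Linear(dim, dim)   # hyperedge -> vertex transform

    def forward(self, x: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
        edge_deg = H.sum(dim=0).clamp(min=1.0)   # hyperedge sizes
        node_deg = H.sum(dim=1).clamp(min=1.0)   # vertex degrees

        # Stage 1: average member vertices into hyperedge features, activate.
        e = torch.relu(self.to_edge((H.t() @ x) / edge_deg[:, None]))
        # Stage 2: average incident hyperedges back into vertices, activate.
        return torch.relu(self.to_node((H @ e) / node_deg[:, None]))
```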