Deep Hypergraph Structure Learning
- URL: http://arxiv.org/abs/2208.12547v1
- Date: Fri, 26 Aug 2022 10:00:11 GMT
- Title: Deep Hypergraph Structure Learning
- Authors: Zizhao Zhang, Yifan Feng, Shihui Ying, Yue Gao
- Abstract summary: Learning on high-order correlation has shown superiority in data representation learning, where hypergraphs have been widely used in recent decades.
How to generate the hypergraph structure among data is still a challenging task.
DeepHGSL is designed to optimize the hypergraph structure for hypergraph-based representation learning.
- Score: 34.972686247703024
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning on high-order correlation has shown superiority in data
representation learning, where hypergraphs have been widely used in recent
decades. The performance of hypergraph-based representation learning methods,
such as hypergraph neural networks, highly depends on the quality of the
hypergraph structure. How to generate the hypergraph structure among data is
still a challenging task. Missing and noisy data may lead to "bad connections"
in the hypergraph structure and destroy the hypergraph-based representation
learning process. Therefore, revealing the high-order structure, i.e., the
hypergraph behind the observed data, becomes an urgent and important task. To
address this issue, we design a general paradigm of deep hypergraph structure
learning, namely DeepHGSL, to optimize the hypergraph structure for
hypergraph-based representation learning. Concretely, inspired by the
information bottleneck principle and its robustness properties, we first
extend it to the hypergraph case, termed the hypergraph information bottleneck
(HIB) principle. We then apply this principle to guide hypergraph structure
learning: the HIB is used to construct a loss function that minimizes the
noisy information in the hypergraph structure. The structure is thereby
optimized during training, a process that can be regarded as enhancing correct
connections and weakening wrong ones. As a result, the proposed method
extracts more robust representations even from a heavily noisy structure.
Finally, we evaluate the model on four
benchmark datasets for representation learning. The experimental results on
both graph- and hypergraph-structured data demonstrate the effectiveness and
robustness of our method compared with other state-of-the-art methods.
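No code accompanies this summary; below is a minimal, hypothetical sketch of the core mechanism described above: learnable hyperedge weights trained under an information-bottleneck-style loss, so that training strengthens correct connections and suppresses noisy ones. The names (DeepHGSLSketch, hib_style_loss, beta) and the sparsity proxy for the HIB compression term are assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

class DeepHGSLSketch(torch.nn.Module):
    def __init__(self, n_feats, n_classes, n_edges):
        super().__init__()
        self.lin = torch.nn.Linear(n_feats, n_classes)
        # Learnable hyperedge weights: training can strengthen "correct"
        # connections and suppress "wrong" (noisy) ones.
        self.edge_logits = torch.nn.Parameter(torch.zeros(n_edges))

    def forward(self, X, H):
        # H: (n_nodes, n_edges) incidence matrix of the observed hypergraph.
        w = torch.sigmoid(self.edge_logits)            # edge weights in (0, 1)
        A = H @ torch.diag(w) @ H.t()                  # weighted clique expansion
        A = A / A.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return self.lin(A @ X), w

def hib_style_loss(logits, y, w, beta=0.01):
    # Task term keeps label-relevant information; the sparsity term is a crude
    # stand-in (assumption) for the HIB compression term that discards noise.
    return F.cross_entropy(logits, y) + beta * w.sum()

# Toy usage with random data.
X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
y = torch.randint(0, 3, (8,))
logits, w = DeepHGSLSketch(16, 3, 5)(X, H)
hib_style_loss(logits, y, w).backward()
```

In the actual method, the compression term is derived from the HIB principle rather than a simple sparsity penalty.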
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose HyperGraph Transformer (HyperGT), a novel hypergraph learning framework.
HyperGT uses a Transformer-based architecture to capture global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by incorporating these global interactions while preserving local connectivity patterns.
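A minimal sketch of the stated idea, assuming nodes and hyperedges are embedded as one token sequence so self-attention models their global correlations; HyperGT's actual structure encodings, which preserve local connectivity, are omitted here.

```python
import torch

def hypergt_style_layer(X, H, attn):
    # X: (n_nodes, d) node features; H: (n_nodes, n_edges) incidence matrix.
    deg_e = H.sum(dim=0, keepdim=True).clamp(min=1)   # hyperedge sizes
    E = (H.t() @ X) / deg_e.t()                       # edge tokens by mean pooling
    tokens = torch.cat([X, E], dim=0).unsqueeze(0)    # (1, n_nodes + n_edges, d)
    out, _ = attn(tokens, tokens, tokens)             # global self-attention
    n = X.shape[0]
    return out[0, :n], out[0, n:]                     # updated node / edge tokens

attn = torch.nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
nodes, edges = hypergt_style_layer(X, H, attn)
```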
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Hypergraph-MLP: Learning on Hypergraphs without Message Passing [41.43504601820411]
Many hypergraph neural networks leverage message passing over hypergraph structures to enhance node representation learning.
We propose an alternative approach that integrates hypergraph structural information into the training supervision, without explicit message passing.
Specifically, we introduce Hypergraph-MLP, a novel learning framework for hypergraph-structured data.
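A hypothetical sketch of how structure can enter the training supervision without message passing: a plain MLP produces embeddings, and the hypergraph appears only in a smoothness-style regularizer. The loss form is an assumption based on the summary, not the paper's exact objective.

```python
import torch
import torch.nn.functional as F

mlp = torch.nn.Sequential(torch.nn.Linear(16, 32), torch.nn.ReLU(),
                          torch.nn.Linear(32, 3))

def structure_loss(Z, H):
    # For each hyperedge, penalize deviation of member embeddings from the
    # hyperedge mean; the hypergraph is used only inside the loss.
    deg_e = H.sum(dim=0).clamp(min=1)                 # (n_edges,)
    means = (H.t() @ Z) / deg_e.unsqueeze(1)          # (n_edges, d)
    diff = Z.unsqueeze(1) - means.unsqueeze(0)        # (n_nodes, n_edges, d)
    return (H.unsqueeze(-1) * diff.pow(2)).sum() / H.sum().clamp(min=1)

X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
y = torch.randint(0, 3, (8,))
Z = mlp(X)
loss = F.cross_entropy(Z, y) + 0.1 * structure_loss(Z, H)
loss.backward()
```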
arXiv Detail & Related papers (2023-12-15T13:30:04Z)
- Hypergraph Structure Inference From Data Under Smoothness Prior [46.568839316694515]
We propose a method to infer the probability for each potential hyperedge without labelled data as supervision.
We use a smoothness prior to derive the relation between the hypergraph structure and the node features via probabilistic modelling.
Experiments on both synthetic and real-world data demonstrate that our method can learn meaningful hypergraph structures from data more efficiently than existing hypergraph structure inference methods.
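A hedged sketch of the general recipe: score each candidate hyperedge by the smoothness of node features within it and map the score to a probability. The variance-based score and sigmoid mapping are illustrative choices, not the paper's probabilistic model.

```python
import torch

def hyperedge_probs(X, candidates, temperature=1.0):
    # candidates: list of node-index tuples, one per potential hyperedge.
    scores = []
    for idx in candidates:
        feats = X[list(idx)]                              # member-node features
        var = (feats - feats.mean(dim=0)).pow(2).mean()   # within-edge variation
        scores.append(-var)                               # smoother -> higher score
    return torch.sigmoid(torch.stack(scores) / temperature)

X = torch.randn(8, 16)
candidates = [(0, 1, 2), (2, 3), (4, 5, 6, 7)]
print(hyperedge_probs(X, candidates))
```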
arXiv Detail & Related papers (2023-08-27T18:28:58Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
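A toy illustration of the energy view, assuming a simple quadratic energy (fidelity to input features plus smoothness under a hypergraph Laplacian) whose minimizer, found by gradient descent, serves as the node embedding; the paper's parameterized energy family is richer than this.

```python
import torch

def hypergraph_laplacian(H):
    Dv = torch.diag(H.sum(dim=1))                 # node degrees
    De_inv = torch.diag(1.0 / H.sum(dim=0).clamp(min=1))
    return Dv - H @ De_inv @ H.t()                # clique-expansion-style Laplacian

def embed_by_energy_min(X, H, lam=0.5, steps=200, lr=0.05):
    L = hypergraph_laplacian(H)
    Z = X.clone().requires_grad_(True)
    for _ in range(steps):
        # Energy: stay close to input features + be smooth over hyperedges.
        energy = (Z - X).pow(2).sum() + lam * torch.trace(Z.t() @ L @ Z)
        g, = torch.autograd.grad(energy, Z)
        Z = (Z - lr * g).detach().requires_grad_(True)
    return Z.detach()

X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
Z = embed_by_energy_min(X, H)                     # minimizer = node embeddings
```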
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We adapt the contrastive learning approach used on images and graphs to hypergraphs (referred to as HyperGCL) to improve the generalizability of hypergraph neural networks.
We fabricate two schemes that augment hyperedges while encoding higher-order relations, and adopt three augmentation strategies from graph-structured data.
We also propose a hypergraph generative model to generate augmented views, together with an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
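A hypothetical sketch of the contrastive setup with one fabricated augmentation (random hyperedge dropping): encode two augmented views and apply an InfoNCE-style loss. HyperGCL's generative augmentations and exact encoder are not reproduced here.

```python
import torch
import torch.nn.functional as F

def drop_hyperedges(H, p=0.3):
    keep = (torch.rand(H.shape[1]) > p).float()
    return H * keep                               # zero out dropped hyperedges

def encode(X, H, lin):
    deg = H.sum(dim=1, keepdim=True).clamp(min=1)
    return lin((H @ H.t() @ X) / deg)             # crude one-hop hypergraph encoder

def info_nce(Z1, Z2, tau=0.5):
    Z1, Z2 = F.normalize(Z1, dim=1), F.normalize(Z2, dim=1)
    logits = Z1 @ Z2.t() / tau                    # node i in view 1 vs all of view 2
    return F.cross_entropy(logits, torch.arange(Z1.shape[0]))

X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
lin = torch.nn.Linear(16, 8)
loss = info_nce(encode(X, drop_hyperedges(H), lin),
                encode(X, drop_hyperedges(H), lin))
loss.backward()
```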
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle hypergraphs with edge-dependent vertex weights (EDVW) and edge-independent vertex weights (EIVW).
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
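For intuition, a sketch of a spectral-style hypergraph convolution using the classic normalized operator Dv^-1/2 H W De^-1 H^T Dv^-1/2 (Zhou et al.), with an incidence matrix whose nonzero entries may act as edge-dependent vertex weights; GHSC's exact EDVW/EIVW construction differs in detail.

```python
import torch

def spectral_hconv(X, H, w, theta):
    # H may hold edge-dependent vertex weights (EDVW): entries need not be 0/1.
    Dv = H @ w                                    # weighted node degrees
    De = H.sum(dim=0)                             # hyperedge degrees
    Dv_is = torch.diag(Dv.clamp(min=1e-6).pow(-0.5))
    A = Dv_is @ H @ torch.diag(w / De.clamp(min=1e-6)) @ H.t() @ Dv_is
    return torch.relu(A @ X @ theta)

X = torch.randn(8, 16)
H = torch.rand(8, 5) * (torch.rand(8, 5) > 0.5)   # weighted incidence (EDVW-style)
w = torch.ones(5)                                 # hyperedge weights
theta = torch.randn(16, 8) * 0.1                  # learnable filter parameters
out = spectral_hconv(X, H, w, theta)
```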
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
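A minimal, assumed realization of the stated idea: two-level message passing where each node's contribution is scaled by a learned function of its degree centrality; the exact attention form in HyperMSG is not reproduced here.

```python
import torch

class HyperMSGSketch(torch.nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.lin = torch.nn.Linear(d_in, d_out)
        self.alpha = torch.nn.Parameter(torch.tensor(0.0))  # degree-attention scale

    def forward(self, X, H):
        deg = H.sum(dim=1)                                # node degree centrality
        a = torch.sigmoid(self.alpha * deg).unsqueeze(1)  # learned per-node weight
        E = H.t() @ (a * X) / H.sum(dim=0, keepdim=True).clamp(min=1).t()
        N = H @ E / H.sum(dim=1, keepdim=True).clamp(min=1)
        return self.lin(N)

X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
out = HyperMSGSketch(16, 8)(X, H)
```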
arXiv Detail & Related papers (2021-09-22T12:24:02Z)
- Graph Information Bottleneck for Subgraph Recognition [103.37499715761784]
We propose a framework of Graph Information Bottleneck (GIB) for the subgraph recognition problem in deep graph learning.
Under this framework, one can recognize the maximally informative yet compressive subgraph, named IB-subgraph.
We evaluate the properties of the IB-subgraph in three application scenarios: improvement of graph classification, graph interpretation and graph denoising.
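A hedged sketch of a GIB-style objective: per-node selection probabilities define a soft subgraph, trained to keep label-relevant information while compressing. Here I(subgraph; Y) is proxied by cross-entropy and compression by a KL term to a sparse Bernoulli prior; both proxies are illustrative stand-ins for the mutual-information terms.

```python
import torch
import torch.nn.functional as F

node_logits = torch.nn.Parameter(torch.zeros(8))      # subgraph selector
clf = torch.nn.Linear(16, 3)                          # graph-level classifier

def gib_loss(X, y, r=0.3, beta=1.0):
    p = torch.sigmoid(node_logits)                    # node inclusion probs
    readout = (p.unsqueeze(1) * X).mean(dim=0)        # soft-subgraph readout
    pred = F.cross_entropy(clf(readout).unsqueeze(0), y)
    # KL(Bernoulli(p) || Bernoulli(r)) encourages a small, compressive subgraph.
    kl = (p * torch.log(p / r + 1e-8)
          + (1 - p) * torch.log((1 - p) / (1 - r) + 1e-8)).mean()
    return pred + beta * kl

X = torch.randn(8, 16)
y = torch.tensor([1])
gib_loss(X, y).backward()
```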
arXiv Detail & Related papers (2020-10-12T09:32:20Z)
- HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
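A minimal sketch of two-level neural message passing, assuming simple mean aggregation at both levels (HyperSAGE itself uses generalized power means): nodes are pooled into hyperedges, hyperedge messages are pooled back into nodes, and the result is combined with each node's own features.

```python
import torch

def two_level_message_passing(X, H, lin):
    de = H.sum(dim=0, keepdim=True).clamp(min=1)      # hyperedge sizes
    dv = H.sum(dim=1, keepdim=True).clamp(min=1)      # node degrees
    msg_e = (H.t() @ X) / de.t()                      # level 1: nodes -> hyperedges
    msg_v = (H @ msg_e) / dv                          # level 2: hyperedges -> nodes
    return torch.relu(lin(torch.cat([X, msg_v], dim=1)))

X, H = torch.randn(8, 16), (torch.rand(8, 5) > 0.5).float()
lin = torch.nn.Linear(32, 8)                          # combine self + aggregated
out = two_level_message_passing(X, H, lin)
```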
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.