Adaptive Hypergraph Network for Trust Prediction
- URL: http://arxiv.org/abs/2402.05154v1
- Date: Wed, 7 Feb 2024 15:21:18 GMT
- Title: Adaptive Hypergraph Network for Trust Prediction
- Authors: Rongwei Xu, Guanfeng Liu, Yan Wang, Xuyun Zhang, Kai Zheng, Xiaofang
Zhou
- Abstract summary: Hypergraphs offer a flexible approach to modeling complex high-order correlations.
Most hypergraph-based methods are generic and cannot be well applied to the trust prediction task.
We propose an Adaptive Hypergraph Network for Trust Prediction (AHNTP) to improve trust prediction accuracy by using higher-order correlations.
- Score: 23.219647971257725
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Trust plays an essential role in an individual's decision-making. Traditional
trust prediction models rely on pairwise correlations to infer potential
relationships between users. However, in the real world, interactions between
users are usually complicated rather than pairwise only. Hypergraphs offer a
flexible approach to modeling these complex high-order correlations, since
hypergraphs can leverage hyperedges to link more
than two nodes. However, most hypergraph-based methods are generic and cannot
be well applied to the trust prediction task. In this paper, we propose an
Adaptive Hypergraph Network for Trust Prediction (AHNTP), a novel approach that
improves trust prediction accuracy by using higher-order correlations. AHNTP
utilizes Motif-based PageRank to capture high-order social influence
information. In addition, it constructs hypergroups from both node-level and
structure-level attributes to incorporate complex correlation information.
Furthermore, AHNTP leverages adaptive hypergraph Graph Convolutional Network
(GCN) layers and multilayer perceptrons (MLPs) to generate comprehensive user
embeddings, facilitating trust relationship prediction. To enhance model
generalization and robustness, we introduce a novel supervised contrastive
learning loss for optimization. Extensive experiments demonstrate the
superiority of our model over the state-of-the-art approaches in terms of trust
prediction accuracy. The source code of this work can be accessed via
https://github.com/Sherry-XU1995/AHNTP.
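The architecture described above ends in a supervised contrastive objective over user embeddings. As a rough illustration of that kind of loss, here is a generic SupCon-style formulation (a sketch only; the function and the toy data are assumptions, not the authors' code):

```python
import numpy as np

def sup_con_loss(z, labels, tau=0.5):
    """Simplified supervised contrastive loss: embeddings sharing a label
    are pulled together; all other embeddings act as negatives."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize
    sim = z @ z.T / tau                                # scaled cosine similarities
    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        # positives: other samples with the same label as the anchor
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        others = np.delete(sim[i], i)                  # drop self-similarity
        log_denom = np.log(np.exp(others).sum())
        for j in pos:
            loss += log_denom - sim[i, j]              # -log softmax(positive)
            count += 1
    return loss / count

# toy check: identical same-class embeddings, orthogonal classes
z = np.zeros((6, 8))
z[:3, 0] = 1.0
z[3:, 1] = 1.0
labels = np.array([0, 0, 0, 1, 1, 1])
loss = sup_con_loss(z, labels)  # low: positives aligned, negatives orthogonal
```

Each summand is non-negative (the log-denominator always dominates any single positive logit), so perfectly clustered embeddings drive the loss toward its minimum rather than zero.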
Related papers
- SPHINX: Structural Prediction using Hypergraph Inference Network [19.853413818941608]
We introduce Structural Prediction using Hypergraph Inference Network (SPHINX), a model that learns to infer a latent hypergraph structure in an unsupervised way.
We show that the recent advancement in k-subset sampling represents a suitable tool for producing discrete hypergraph structures.
The resulting model can generate the higher-order structure necessary for any modern hypergraph neural network.
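The k-subset sampling mentioned above is commonly realized via the Gumbel top-k trick, which draws a subset of k indices with probability driven by per-node logits. A minimal sketch of that generic trick (not SPHINX's actual implementation; all names are illustrative):

```python
import numpy as np

def gumbel_top_k(logits, k, rng):
    """Sample a k-subset of indices without replacement, biased toward
    higher logits, via the Gumbel top-k trick."""
    g = rng.gumbel(size=logits.shape)           # i.i.d. Gumbel(0, 1) noise
    return np.argsort(logits + g)[::-1][:k]     # indices of the k largest

rng = np.random.default_rng(0)
logits = np.array([3.0, 0.0, 0.0, 2.0, -1.0])
# one draw = one candidate hyperedge, as a discrete k-subset of node indices
hyperedge = gumbel_top_k(logits, 3, rng)
```

Because the noise is added before a deterministic top-k, repeated draws concentrate on high-logit nodes while still exploring the rest.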
arXiv Detail & Related papers (2024-10-04T07:49:57Z) - Federated Hypergraph Learning with Hyperedge Completion [6.295242666794106]
Hypergraph neural networks enhance conventional graph neural networks by capturing high-order relationships among nodes.
We propose FedHGN, a novel algorithm for federated hypergraph learning.
Our algorithm utilizes subgraphs of a hypergraph stored on distributed devices to train local HGNN models in a federated manner.
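Training local models on distributed sub-hypergraphs and combining them is typically done with federated averaging; a toy sketch of one aggregation round (function, variable names, and shapes are illustrative, not FedHGN's code):

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """One federated averaging round: combine local model parameters,
    weighting each client by its local (sub-hypergraph) size."""
    total = sum(client_sizes)
    keys = client_weights[0].keys()
    return {k: sum(w[k] * (n / total)
                   for w, n in zip(client_weights, client_sizes))
            for k in keys}

# two clients, each holding the parameters of a local HGNN layer (toy shapes)
w1 = {"W": np.ones((2, 2)), "b": np.zeros(2)}
w2 = {"W": 3 * np.ones((2, 2)), "b": np.ones(2)}
global_w = fed_avg([w1, w2], client_sizes=[1, 3])   # weights 0.25 and 0.75
```

The server never sees raw data, only parameter dictionaries, which is the privacy argument behind federated hypergraph learning.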
arXiv Detail & Related papers (2024-08-09T16:31:41Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
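Considering global correlations among all nodes and hyperedges amounts to running self-attention over both token types jointly; a minimal single-head sketch (illustrative only, not HyperGT's architecture):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))   # numerically stable
    return e / e.sum(axis=axis, keepdims=True)

def global_attention(tokens, Wq, Wk, Wv):
    """Single-head self-attention over ALL tokens (node and hyperedge tokens
    together), so every pair can interact regardless of incidence."""
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]))        # attention weights
    return A @ V

rng = np.random.default_rng(0)
d = 4
tokens = rng.normal(size=(6, d))   # e.g. 4 node tokens + 2 hyperedge tokens
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = global_attention(tokens, Wq, Wk, Wv)
```

Local connectivity would then be reinjected separately (e.g. via positional encodings derived from the incidence structure), which is the part this sketch omits.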
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks [9.987252149421982]
We present a novel self-supervised pretraining framework for heterogeneous HyperGNNs.
Our method is able to effectively capture higher-order relations among entities in the data in a self-supervised manner.
Our experiments show that our proposed framework consistently outperforms state-of-the-art baselines in various downstream tasks.
arXiv Detail & Related papers (2023-11-19T16:34:56Z) - Enhancing Hyperedge Prediction with Context-Aware Self-Supervised
Learning [64.46188414653204]
We propose a novel hyperedge prediction framework (CASH).
CASH employs (1) context-aware node aggregation to capture complex relations among the nodes in each hyperedge (addressing C1), and (2) self-supervised contrastive learning in the context of hyperedge prediction to enhance hypergraph representations (addressing C2).
Experiments on six real-world hypergraphs reveal that CASH consistently outperforms all competing methods in terms of the accuracy in hyperedge prediction.
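Hyperedge prediction ultimately reduces to scoring a candidate node set; a toy permutation-invariant scorer (a generic sketch, not CASH's aggregation scheme; all names are hypothetical):

```python
import numpy as np

def score_hyperedge(Z, nodes, w):
    """Score a candidate hyperedge: aggregate the member nodes' embeddings
    with a set-level (order-independent) mean, then squash to (0, 1)."""
    agg = Z[nodes].mean(axis=0)                 # permutation-invariant pooling
    return 1.0 / (1.0 + np.exp(-(agg @ w)))     # sigmoid score

rng = np.random.default_rng(0)
Z = rng.normal(size=(5, 4))   # embeddings for 5 nodes
w = rng.normal(size=4)        # projection to a scalar score
p = score_hyperedge(Z, [0, 2, 3], w)
```

The mean pooling makes the score depend only on which nodes are in the candidate set, not on the order they are listed in, which is a basic requirement for any hyperedge scorer.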
arXiv Detail & Related papers (2023-09-11T20:06:00Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
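A minimal instance of this idea: minimize a quadratic, hypergraph-regularized energy by gradient descent and use the minimizer as node embeddings. The Laplacian below follows the standard Zhou-style normalization with unit hyperedge weights; this is an illustration of the general recipe, not necessarily the paper's exact energy family.

```python
import numpy as np

def hypergraph_laplacian(H):
    """Normalized hypergraph Laplacian from incidence matrix H
    (nodes x hyperedges), with unit hyperedge weights."""
    dv = H.sum(axis=1)                 # node degrees
    de = H.sum(axis=0)                 # hyperedge degrees
    Dv = np.diag(1.0 / np.sqrt(dv))
    De = np.diag(1.0 / de)
    return np.eye(H.shape[0]) - Dv @ H @ De @ H.T @ Dv

def energy_embeddings(H, X, lam=1.0, steps=200, lr=0.1):
    """Gradient descent on E(Y) = ||Y - X||^2 + lam * tr(Y^T L Y);
    the minimizer Y serves as smoothed node embeddings."""
    L = hypergraph_laplacian(H)
    Y = X.copy()
    for _ in range(steps):
        Y -= lr * (2 * (Y - X) + 2 * lam * (L @ Y))
    return Y

# 4 nodes, 2 hyperedges: {0, 1, 2} and {2, 3}
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], float)
X = np.array([[1.0], [0.0], [0.0], [1.0]])
Y = energy_embeddings(H, X)
```

For this quadratic energy the minimizer has a closed form, Y* = (I + λL)⁻¹X, so the gradient loop can be checked against a direct solve; deeper HNN layers arise when the energy is made non-quadratic and the descent steps are unrolled.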
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
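High-order outer-product message passing can be illustrated for a degree-3 hyperedge: form the rank-1 tensor of the member features and contract it with a weight tensor (a sketch of the flavor of the approach, not THNN's exact scheme):

```python
import numpy as np

def outer_product_message(xs, W):
    """Hyperedge message via a high-order outer product: for a degree-3
    hyperedge, contract x_i (x) x_j (x) x_k with a weight tensor W."""
    t = np.einsum('i,j,k->ijk', *xs)       # rank-1 third-order tensor
    return np.einsum('ijk,ijko->o', t, W)  # contract down to an output feature

d, out_dim = 3, 2
rng = np.random.default_rng(1)
W = rng.normal(size=(d, d, d, out_dim))    # weight tensor for degree-3 edges
xs = [rng.normal(size=d) for _ in range(3)]
msg = outer_product_message(xs, W)
```

Unlike a sum or mean of node features, the outer product captures multiplicative interactions among all members of the hyperedge, at the cost of a weight tensor that grows with the hyperedge degree.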
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Ordinal Graph Gamma Belief Network for Social Recommender Systems [54.9487910312535]
We develop a hierarchical Bayesian model termed ordinal graph factor analysis (OGFA), which jointly models user-item and user-user interactions.
OGFA not only achieves good recommendation performance, but also extracts interpretable latent factors corresponding to representative user preferences.
We extend OGFA to ordinal graph gamma belief network, which is a multi-stochastic-layer deep probabilistic model.
arXiv Detail & Related papers (2022-09-12T09:19:22Z) - Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs) using neural networks to encode hypergraphs provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operators.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z) - Learnable Hypergraph Laplacian for Hypergraph Learning [34.28748027233654]
HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph structured data.
We propose the first learning-based method tailored for constructing an adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD).
HERALD adaptively optimizes the adjacency relationships between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned.
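The core idea of a learnable hypergraph structure can be sketched by parameterizing a soft incidence matrix and building the Laplacian from it differentiably (names and the sigmoid parameterization are illustrative, not HERALD's implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_incidence(theta):
    """The node/hyperedge adjacency is not fixed but parameterized by
    learnable logits; sigmoid keeps entries in (0, 1) and differentiable."""
    return sigmoid(theta)

def soft_laplacian(H):
    """Normalized hypergraph Laplacian built from a soft incidence matrix."""
    dv = H.sum(axis=1)                 # (soft) node degrees
    de = H.sum(axis=0)                 # (soft) hyperedge degrees
    Dv = np.diag(1.0 / np.sqrt(dv))
    De = np.diag(1.0 / de)
    return np.eye(H.shape[0]) - Dv @ H @ De @ H.T @ Dv

theta = np.zeros((4, 2))   # learnable logits: 4 nodes, 2 hyperedges
H = soft_incidence(theta)  # all entries start at 0.5 (uninformative prior)
L = soft_laplacian(H)
```

Because every step from `theta` to `L` is differentiable, gradients from a downstream task loss can update the incidence logits, which is what makes the resulting hypergraph task-aware.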
arXiv Detail & Related papers (2021-06-12T02:07:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.