BSAL: A Framework of Bi-component Structure and Attribute Learning for
Link Prediction
- URL: http://arxiv.org/abs/2204.09508v1
- Date: Mon, 18 Apr 2022 03:12:13 GMT
- Title: BSAL: A Framework of Bi-component Structure and Attribute Learning for
Link Prediction
- Authors: Bisheng Li, Min Zhou, Shengzhong Zhang, Menglin Yang, Defu Lian,
Zengfeng Huang
- Abstract summary: We propose a bicomponent structural and attribute learning framework (BSAL) that is designed to adaptively leverage information from topology and feature spaces.
BSAL constructs a semantic topology from the node attributes and then obtains embeddings for this semantic view.
It provides a flexible and easy-to-implement solution to adaptively incorporate the information carried by the node attributes.
- Score: 33.488229191263564
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Given the ubiquity of graph-structured data, learning node
representations for downstream tasks ranging from node classification and link
prediction to graph classification is of crucial importance. Focusing on
missing-link inference in diverse networks, we revisit existing link prediction
techniques and identify the importance of both structural and attribute
information. However, the available techniques either rely heavily on the
network topology, which can be spurious in practice, or cannot properly
integrate graph topology and node features. To bridge this gap, we propose a
bicomponent structural and attribute learning framework (BSAL) that is designed
to adaptively leverage information from topology and feature spaces.
Specifically, BSAL constructs a semantic topology from the node attributes and
then obtains embeddings for this semantic view, providing a flexible and
easy-to-implement way to adaptively incorporate the information carried by the
node attributes. The semantic embedding is then fused with the topology
embedding via an attention mechanism for the final prediction. Extensive
experiments demonstrate the superior performance of our proposal, which
significantly outperforms baselines on diverse research benchmarks.
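
Since the abstract describes the architecture only at a high level, the sketch below illustrates one way such a bi-component model could be wired up. All concrete choices are assumptions rather than the paper's specification: a cosine-kNN graph as the semantic topology, a single dense GCN layer per view, per-node two-way attention for fusion, and dot-product scoring of candidate edges.

```python
# Minimal sketch of a BSAL-style bi-component link predictor.
# Assumptions (not from the paper): dense adjacency matrices, a cosine-kNN
# graph as the semantic topology, one GCN layer per view, per-node two-way
# attention for fusion, and dot-product scoring of candidate edges.
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize a dense adjacency matrix with self-loops."""
    adj = adj + torch.eye(adj.size(0), device=adj.device)
    d_inv_sqrt = adj.sum(dim=1).clamp(min=1e-12).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


def semantic_knn_graph(x: torch.Tensor, k: int = 10) -> torch.Tensor:
    """One way to build a 'semantic topology': a kNN graph over node attributes."""
    xn = F.normalize(x, dim=1)
    sim = xn @ xn.t()                        # cosine similarity between nodes
    sim.fill_diagonal_(float("-inf"))        # no self-edges
    idx = sim.topk(k, dim=1).indices
    adj = torch.zeros_like(sim)
    adj.scatter_(1, idx, 1.0)
    return ((adj + adj.t()) > 0).float()     # symmetrize


class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        return F.relu(adj_norm @ self.lin(x))


class BSALSketch(nn.Module):
    """Two view-specific encoders + attention fusion + dot-product link scorer."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.topo_enc = GCNLayer(in_dim, hid_dim)   # encoder over the observed graph
        self.sem_enc = GCNLayer(in_dim, hid_dim)    # encoder over the semantic graph
        self.att = nn.Linear(hid_dim, 1)            # scores how much to trust each view

    def forward(self, x, adj_topo, adj_sem, edge_pairs):
        z_topo = self.topo_enc(x, adj_topo)
        z_sem = self.sem_enc(x, adj_sem)
        w = torch.softmax(torch.cat([self.att(z_topo), self.att(z_sem)], dim=1), dim=1)
        z = w[:, :1] * z_topo + w[:, 1:] * z_sem    # node-wise fusion of the two views
        src, dst = edge_pairs[:, 0], edge_pairs[:, 1]
        return torch.sigmoid((z[src] * z[dst]).sum(dim=1))   # link probabilities
```

Training would follow the usual link-prediction recipe: binary cross-entropy over observed edges and sampled non-edges, with adj_topo = normalize_adj(A) and adj_sem = normalize_adj(semantic_knn_graph(X)). The point the sketch tries to capture is that attribute information enters through its own graph view rather than only as input features, with the attention weights deciding per node how much each view contributes.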
Related papers
- On the Impact of Feature Heterophily on Link Prediction with Graph Neural Networks [12.26334940017605]
Heterophily, or the tendency of connected nodes in networks to have different class labels or dissimilar features, has been identified as challenging for many Graph Neural Network (GNN) models.
We focus on the link prediction task and systematically analyze the impact of heterophily in node features on GNN performance.
arXiv Detail & Related papers (2024-09-26T02:19:48Z)
- Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
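A small illustrative sketch of this sparsify-then-contrast idea is given after this list.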
arXiv Detail & Related papers (2024-09-09T12:56:02Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Node Classification via Semantic-Structural Attention-Enhanced Graph Convolutional Networks [0.9463895540925061]
We introduce the semantic-structural attention-enhanced graph convolutional network (SSA-GCN).
It not only models the graph structure but also extracts generalized unsupervised features to enhance classification performance.
Our experiments on the Cora and CiteSeer datasets demonstrate the performance improvements achieved by our proposed method.
arXiv Detail & Related papers (2024-03-24T06:28:54Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Revisiting Link Prediction: A Data Perspective [59.296773787387224]
Link prediction, a fundamental task on graphs, has proven indispensable in various applications, e.g., friend recommendation, protein analysis, and drug interaction prediction.
Evidence in existing literature underscores the absence of a universally best algorithm suitable for all datasets.
We recognize three fundamental factors critical to link prediction: local structural proximity, global structural proximity, and feature proximity.
arXiv Detail & Related papers (2023-10-01T21:09:59Z)
- GAGE: Geometry Preserving Attributed Graph Embeddings [34.25102483600248]
This paper presents a novel approach for node embedding in attributed networks.
It preserves the distances of both the connections and the attributes.
An effective and lightweight algorithm is developed to tackle the learning task.
arXiv Detail & Related papers (2020-11-03T02:07:02Z)
- Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB), GIB aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
arXiv Detail & Related papers (2020-10-24T07:13:00Z)
- AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose adaptive multi-channel graph convolutional networks for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN extracts the most correlated information from both node features and topological structures.
arXiv Detail & Related papers (2020-07-05T08:16:03Z)
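
As a companion to the GSSC entry above, here is a small, hypothetical sketch of the sparsify-then-self-contrast idea it describes. The cosine-similarity edge ranking, the InfoNCE-style loss, and the random negative sampling are illustrative assumptions; the paper's actual formulation may differ.

```python
# Illustrative sketch of a GSSC-style sparsify-then-self-contrast objective.
# Assumptions (not from the paper): cosine-similarity edge ranking for
# sparsification and an InfoNCE-style loss with randomly sampled negatives.
import torch
import torch.nn as nn
import torch.nn.functional as F


def sparsify_edges(edge_index: torch.Tensor, x: torch.Tensor, keep_ratio: float = 0.8):
    """Keep only the edges whose endpoints are most similar in feature space."""
    src, dst = edge_index                                   # edge_index has shape [2, E]
    sim = F.cosine_similarity(x[src], x[dst], dim=1)
    keep = sim.topk(int(keep_ratio * edge_index.size(1))).indices
    return edge_index[:, keep]


class MLPEncoder(nn.Module):
    """Pure MLP: graph structure never enters the forward pass, only the loss below."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))

    def forward(self, x):
        return self.net(x)


def self_contrast_loss(z, edge_index, tau: float = 0.5, num_neg: int = 256):
    """Pull sparsified neighbors together, push randomly sampled nodes apart."""
    z = F.normalize(z, dim=1)
    src, dst = edge_index
    pos = (z[src] * z[dst]).sum(dim=1, keepdim=True) / tau                     # [E, 1]
    neg_idx = torch.randperm(z.size(0), device=z.device)[:num_neg]
    neg = z[src] @ z[neg_idx].t() / tau                                        # [E, <=num_neg]
    logits = torch.cat([pos, neg], dim=1)
    target = torch.zeros(logits.size(0), dtype=torch.long, device=z.device)
    return F.cross_entropy(logits, target)                                     # positives at index 0
```

In use, one would sparsify the observed edge_index (once or per epoch), encode the node features with the MLP, and minimize self_contrast_loss; at inference time the MLP alone produces structure-aware embeddings, which matches the entry's claim that structural information is only implicitly incorporated as prior knowledge.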