Research on Joint Representation Learning Methods for Entity
Neighborhood Information and Description Information
- URL: http://arxiv.org/abs/2309.08100v1
- Date: Fri, 15 Sep 2023 01:38:07 GMT
- Title: Research on Joint Representation Learning Methods for Entity
Neighborhood Information and Description Information
- Authors: Le Xiao and Xin Shan and Yuhua Wang and Miaolei Deng
- Abstract summary: A joint learning model that combines entity neighborhood information and description information is proposed.
Experimental results demonstrate that the proposed model achieves favorable performance on the knowledge graph dataset of the programming design course.
- Score: 2.206623168926072
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To address the issue of poor embedding performance in the knowledge graph of
a programming design course, a joint representation learning model that
combines entity neighborhood information and description information is
proposed. First, a graph attention network is employed to obtain the features
of an entity's neighboring nodes, incorporating relationship features to
enrich the structural information. Next, the BERT-WWM model is used in
conjunction with attention mechanisms to obtain the representation of entity
description information. Finally, the vector representations of entity
neighborhood information and description information are combined to obtain
the final entity representation. Experimental results demonstrate that the
proposed model achieves favorable performance on the knowledge graph dataset
of the programming design course, outperforming other baseline models.
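The pipeline described above (graph attention over neighbors, BERT-WWM over descriptions, fusion of the two vectors) can be summarized in a minimal sketch. This is an illustrative reconstruction, not the authors' code: it assumes PyTorch with torch_geometric and Hugging Face transformers, uses the hfl/chinese-bert-wwm checkpoint as a stand-in for BERT-WWM, pools the description with the [CLS] token instead of the paper's attention mechanism, and omits the relation-feature enrichment of the GAT branch.

```python
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv   # assumed dependency: PyTorch Geometric
from transformers import AutoModel       # assumed dependency: Hugging Face transformers

class JointEntityEncoder(nn.Module):
    """Fuse a GAT view of an entity's neighborhood with a BERT-WWM view of its description."""
    def __init__(self, node_dim=128, hidden_dim=128, bert_name="hfl/chinese-bert-wwm"):
        super().__init__()
        # Structural branch: graph attention aggregates neighboring-node features.
        self.gat = GATConv(node_dim, hidden_dim, heads=4, concat=False)
        # Textual branch: BERT-WWM encodes the entity description text.
        self.bert = AutoModel.from_pretrained(bert_name)
        self.text_proj = nn.Linear(self.bert.config.hidden_size, hidden_dim)
        # Fusion: concatenate the two views and project to the final entity vector.
        self.fuse = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, x, edge_index, desc_input_ids, desc_attention_mask):
        # (1) Neighborhood representation from the graph attention network.
        h_struct = self.gat(x, edge_index)                    # [num_entities, hidden_dim]
        # (2) Description representation from BERT-WWM; descriptions are assumed
        #     to be tokenized in the same order as the graph nodes.
        out = self.bert(input_ids=desc_input_ids, attention_mask=desc_attention_mask)
        h_text = self.text_proj(out.last_hidden_state[:, 0])  # [CLS] pooling, [num_entities, hidden_dim]
        # (3) Combine the two vector representations into the final entity embedding.
        return self.fuse(torch.cat([h_struct, h_text], dim=-1))
```

Concatenation followed by a linear projection is only one plausible fusion choice; the abstract does not pin down the exact fusion operator.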
Related papers
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- KGLM: Integrating Knowledge Graph Structure in Language Models for Link Prediction [0.0]
We introduce a new entity/relation embedding layer that learns to differentiate distinctive entity and relation types.
We show that further pre-training the language model with this additional embedding layer on triples extracted from the knowledge graph, followed by the standard fine-tuning phase, sets a new state of the art for link prediction on the benchmark datasets.
arXiv Detail & Related papers (2022-11-04T20:38:12Z)
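As a rough illustration of the KGLM-style idea above, the following sketch adds a learned entity/relation-type embedding on top of ordinary token embeddings. It assumes PyTorch; the class name, the three-way type vocabulary (plain text / entity / relation), and the additive combination are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class TypeAwareEmbedding(nn.Module):
    """Token embeddings plus a learned entity/relation-type signal per position."""
    def __init__(self, vocab_size: int, num_types: int, dim: int):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)
        self.type_emb = nn.Embedding(num_types, dim)   # e.g. 0=text, 1=entity, 2=relation

    def forward(self, token_ids: torch.Tensor, type_ids: torch.Tensor) -> torch.Tensor:
        # Each position receives its usual token embedding plus a type embedding,
        # so the model can differentiate entity and relation mentions during
        # further pre-training on knowledge-graph triples.
        return self.token_emb(token_ids) + self.type_emb(type_ids)

# Hypothetical usage on one linearized triple fragment.
emb = TypeAwareEmbedding(vocab_size=30522, num_types=3, dim=768)
vectors = emb(torch.tensor([[101, 7592, 102]]), torch.tensor([[0, 1, 0]]))
```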
- MINER: Improving Out-of-Vocabulary Named Entity Recognition from an Information Theoretic Perspective [57.19660234992812]
NER models have achieved promising performance on standard NER benchmarks.
Recent studies show that previous approaches may over-rely on entity mention information, resulting in poor performance on out-of-vocabulary (OOV) entity recognition.
We propose MINER, a novel NER learning framework, to remedy this issue from an information-theoretic perspective.
arXiv Detail & Related papers (2022-04-09T05:18:20Z)
- Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment [9.701081498310165]
Entity alignment aims at integrating heterogeneous knowledge from different knowledge graphs.
Recent studies employ embedding-based methods that first learn representations of knowledge graphs and then perform entity alignment.
We propose a Graph Convolutional Network (GCN) model equipped with knowledge distillation for entity alignment.
arXiv Detail & Related papers (2022-01-25T02:47:14Z)
- End-to-End Hierarchical Relation Extraction for Generic Form Understanding [0.6299766708197884]
We present a novel deep neural network to jointly perform both entity detection and link prediction.
Our model extends the Multi-stage Attentional U-Net architecture with the Part-Intensity Fields and Part-Association Fields for link prediction.
We demonstrate the effectiveness of the model on the Form Understanding in Noisy Scanned Documents dataset.
arXiv Detail & Related papers (2021-06-02T06:51:35Z)
- Unified Graph Structured Models for Video Understanding [93.72081456202672]
We propose a message passing graph neural network that explicitly models spatio-temporal relations.
We show how our method is able to more effectively model relationships between relevant entities in the scene.
arXiv Detail & Related papers (2021-03-29T14:37:35Z)
- DisenE: Disentangling Knowledge Graph Embeddings [33.169388832519]
DisenE is an end-to-end framework to learn disentangled knowledge graph embeddings.
We introduce an attention-based mechanism that enables the model to explicitly focus on relevant components of entity embeddings according to a given relation.
arXiv Detail & Related papers (2020-10-28T03:45:19Z)
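To make the DisenE-style mechanism above concrete, here is a minimal sketch of relation-conditioned attention over K disentangled components of an entity embedding. It assumes PyTorch; the number of components, the dot-product scoring, and all names are illustrative assumptions rather than the paper's implementation.

```python
import torch
import torch.nn as nn

class RelationComponentAttention(nn.Module):
    """Attend over K disentangled components of an entity embedding, conditioned on a relation."""
    def __init__(self, num_entities, num_relations, k_components, dim):
        super().__init__()
        # Each entity is stored as K component vectors rather than one monolithic vector.
        self.ent = nn.Embedding(num_entities, k_components * dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.k, self.dim = k_components, dim

    def forward(self, e_idx, r_idx):
        comps = self.ent(e_idx).view(-1, self.k, self.dim)  # [batch, K, dim]
        r = self.rel(r_idx).unsqueeze(-1)                   # [batch, dim, 1]
        att = torch.softmax(comps @ r, dim=1)               # relevance of each component to the relation
        return (att * comps).sum(dim=1)                     # relation-specific entity representation

# Hypothetical usage: entity 3 viewed under relation 1.
model = RelationComponentAttention(num_entities=100, num_relations=10, k_components=4, dim=64)
vec = model(torch.tensor([3]), torch.tensor([1]))           # shape [1, 64]
```

The softmax weights play the role of the explicit focus: a given relation up-weights the components relevant to it and down-weights the rest.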
- Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB), GIB aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
arXiv Detail & Related papers (2020-10-24T07:13:00Z)
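For reference, the general Information Bottleneck objective that GIB inherits trades task relevance against compression; this is the standard IB formulation, not GIB's exact graph-specific variant:

$$\max_{Z}\; I(Z; Y) \;-\; \beta\, I(Z; X)$$

where $Z$ is the learned representation, $Y$ the task labels, $X$ the input (graph structure and node features in the GIB setting), and $\beta$ controls how strongly task-irrelevant input information is compressed away.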
- Bidirectional Graph Reasoning Network for Panoptic Segmentation [126.06251745669107]
We introduce a Bidirectional Graph Reasoning Network (BGRNet) to mine the intra-modular and inter-modular relations within and between foreground things and background stuff classes.
BGRNet first constructs image-specific graphs in both instance and semantic segmentation branches that enable flexible reasoning at the proposal level and class level.
arXiv Detail & Related papers (2020-04-14T02:32:10Z)
- Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
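A toy sketch of the GMI-style recipe above: maximize agreement between inputs and encoder outputs by training a binary discriminator to separate matched from shuffled (input, representation) pairs. It assumes PyTorch; the linear encoder standing in for a graph neural encoder, the bilinear discriminator, and the binary cross-entropy surrogate are illustrative assumptions, not the paper's exact GMI estimator.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MIMaximizer(nn.Module):
    """Encoder plus a discriminator that scores (input, representation) pairs."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hid_dim)     # stand-in for a graph neural encoder
        self.disc = nn.Bilinear(in_dim, hid_dim, 1)   # bilinear critic over (input, output) pairs

    def forward(self, x):
        h = torch.relu(self.encoder(x))
        pos = self.disc(x, h)                                 # matched pairs: high score wanted
        neg = self.disc(x[torch.randperm(x.size(0))], h)      # shuffled pairs: low score wanted
        # Binary cross-entropy surrogate for mutual-information maximization.
        loss = F.binary_cross_entropy_with_logits(pos, torch.ones_like(pos)) + \
               F.binary_cross_entropy_with_logits(neg, torch.zeros_like(neg))
        return h, loss

# Hypothetical usage on node features only (graph edges omitted for brevity).
model = MIMaximizer(in_dim=32, hid_dim=16)
features = torch.randn(64, 32)
representations, loss = model(features)
```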