Comparative Analysis of Contextual Relation Extraction based on Deep
Learning Models
- URL: http://arxiv.org/abs/2309.06814v1
- Date: Wed, 13 Sep 2023 09:05:09 GMT
- Title: Comparative Analysis of Contextual Relation Extraction based on Deep
Learning Models
- Authors: R. Priyadharshini, G. Jeyakodi, P. Shanthi Bala
- Abstract summary: An efficient and accurate CRE system is essential for creating domain knowledge in the biomedical industry.
Deep learning techniques have been used to identify the appropriate semantic relation based on the context from multiple sentences.
This paper explores the analysis of various deep learning models that are used for relation extraction.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Contextual Relation Extraction (CRE) is mainly used for constructing a
knowledge graph with the help of an ontology. Such a graph supports tasks such as
semantic search, query answering, and textual entailment. Relation extraction
identifies the entities in raw text and the relations among them. An
efficient and accurate CRE system is essential for creating domain knowledge in
the biomedical industry. Existing Machine Learning and Natural Language
Processing (NLP) techniques are not well suited to efficiently predicting complex
relations from sentences that contain more than two relations and unspecified
entities. In this work, deep learning techniques have been used to identify
the appropriate semantic relation based on the context of multiple sentences.
Although various machine learning models have been used for relation
extraction, they provide good results only for binary relations, i.e.,
relations that occur between exactly two entities in a sentence. Machine
learning models are also not suited to complex sentences that contain words
with multiple meanings. To address these issues, hybrid deep learning
models have been used to extract relations from complex sentences
effectively. This paper explores and analyzes various deep learning models
used for relation extraction.
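The binary-relation limitation the abstract describes can be made concrete with a minimal, purely illustrative sketch (not the paper's method, and far simpler than any of the deep learning models it surveys): a rule-based extractor that recovers at most one (head, relation, tail) triplet per sentence and therefore fails on sentences carrying more than one relation. The verb vocabulary and example sentences below are invented for illustration.

```python
import re

# Minimal rule-based extractor: recovers a single (head, relation, tail)
# triplet per sentence using a fixed verb pattern. Illustrative only --
# this is exactly the kind of model that handles binary relations but
# misses sentences containing multiple relations.
PATTERN = re.compile(
    r"^(?P<head>[A-Z]\w+) (?P<rel>inhibits|activates|binds) (?P<tail>\w+)\.?$"
)

def extract_triplet(sentence: str):
    m = PATTERN.match(sentence.strip())
    if m is None:
        return None  # complex or multi-relation sentences are missed
    return (m.group("head"), m.group("rel"), m.group("tail"))

print(extract_triplet("Aspirin inhibits COX1."))
# → ('Aspirin', 'inhibits', 'COX1')

# A sentence with two relations defeats the single-triplet pattern:
print(extract_triplet("Aspirin inhibits COX1 and activates AMPK."))
# → None
```

The second call returning None is the failure mode the abstract attributes to classical models, and the gap that contextual, hybrid deep learning approaches aim to close.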
Related papers
- Modeling Comparative Logical Relation with Contrastive Learning for Text Generation [43.814189025925096]
We introduce a new data-to-text (D2T) task named Comparative Logical Relation Generation (CLRG).
We propose a Comparative Logic (CoLo) based text generation method, which generates texts following specific comparative logical relations with contrastive learning.
Our method achieves impressive performance in both automatic and human evaluations.
arXiv Detail & Related papers (2024-06-13T13:25:50Z)
- Relational Sentence Embedding for Flexible Semantic Matching [86.21393054423355]
We present Relational Sentence Embedding (RSE), a new paradigm for further exploring the potential of sentence embeddings.
RSE is effective and flexible in modeling sentence relations and outperforms a series of state-of-the-art embedding methods.
arXiv Detail & Related papers (2022-12-17T05:25:17Z)
- RelationPrompt: Leveraging Prompts to Generate Synthetic Data for Zero-Shot Relation Triplet Extraction [65.4337085607711]
We introduce the task setting of Zero-Shot Relation Triplet Extraction (ZeroRTE).
Given an input sentence, each extracted triplet consists of the head entity, relation label, and tail entity where the relation label is not seen at the training stage.
We propose to synthesize relation examples by prompting language models to generate structured texts.
arXiv Detail & Related papers (2022-03-17T05:55:14Z)
- Improving Sentence-Level Relation Extraction through Curriculum Learning [7.117139527865022]
We propose a curriculum learning-based relation extraction model that splits the data by difficulty and uses it for learning.
In experiments on the representative sentence-level relation extraction datasets TACRED and Re-TACRED, the proposed method showed good performance.
arXiv Detail & Related papers (2021-07-20T08:44:40Z)
- R$^2$-Net: Relation of Relation Learning Network for Sentence Semantic Matching [58.72111690643359]
We propose a Relation of Relation Learning Network (R2-Net) for sentence semantic matching.
We first employ BERT to encode the input sentences from a global perspective.
Then a CNN-based encoder is designed to capture keywords and phrase information from a local perspective.
To fully leverage labels for better relation information extraction, we introduce a self-supervised relation of relation classification task.
arXiv Detail & Related papers (2020-12-16T13:11:30Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Learning Informative Representations of Biomedical Relations with Latent Variable Models [2.4366811507669115]
We propose a latent variable model with an arbitrarily flexible distribution to represent the relation between an entity pair.
We demonstrate that our model achieves results competitive with strong baselines for both tasks while having fewer parameters and being significantly faster to train.
arXiv Detail & Related papers (2020-11-20T08:56:31Z)
- A Comparative Study on Structural and Semantic Properties of Sentence Embeddings [77.34726150561087]
We propose a set of experiments using a widely-used large-scale data set for relation extraction.
We show that different embedding spaces have different degrees of strength for the structural and semantic properties.
These results provide useful information for developing embedding-based relation extraction methods.
arXiv Detail & Related papers (2020-09-23T15:45:32Z)
- Inferential Text Generation with Multiple Knowledge Sources and Meta-Learning [117.23425857240679]
We study the problem of generating inferential texts of events for a variety of commonsense relations, such as if-else relations.
Existing approaches typically use limited evidence from training examples and learn for each relation individually.
In this work, we use multiple knowledge sources as fuels for the model.
arXiv Detail & Related papers (2020-04-07T01:49:18Z)
- A logic-based relational learning approach to relation extraction: The OntoILPER system [0.9176056742068812]
We present OntoILPER, a logic-based relational learning approach to Relation Extraction.
OntoILPER takes advantage of a rich relational representation of examples, which can alleviate the drawbacks of statistical approaches.
The proposed relational approach seems to be more suitable for Relation Extraction than statistical ones.
arXiv Detail & Related papers (2020-01-13T12:47:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.