Deep Neural Approaches to Relation Triplets Extraction: A Comprehensive
Survey
- URL: http://arxiv.org/abs/2103.16929v1
- Date: Wed, 31 Mar 2021 09:27:15 GMT
- Title: Deep Neural Approaches to Relation Triplets Extraction: A Comprehensive
Survey
- Authors: Tapas Nayak and Navonil Majumder and Pawan Goyal and Soujanya Poria
- Abstract summary: We focus on relation extraction using deep neural networks on publicly available datasets.
We cover approaches ranging from sentence-level to document-level relation extraction, from pipeline-based to joint extraction, and from annotated to distantly supervised datasets.
Regarding neural architectures, we cover convolutional models, recurrent network models, attention network models, and graph convolutional models in this survey.
- Score: 22.586079965178975
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, with the advances made in continuous representations of words (word
embeddings) and deep neural architectures, many research works have been published in
the area of relation extraction, and it has become very difficult to keep track of so
many papers. To help future research, we present a comprehensive review of the
recently published work on relation extraction. We mostly focus on relation
extraction using deep neural networks, which have achieved state-of-the-art
performance on publicly available datasets. In this survey, we cover approaches
ranging from sentence-level to document-level relation extraction, from pipeline-based
to joint extraction, and from annotated to distantly supervised datasets, along with a
few very recent research directions such as zero-shot or few-shot relation extraction
and noise mitigation in distantly supervised datasets. Regarding neural architectures,
we cover convolutional models, recurrent network models, attention network models, and
graph convolutional models.
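To make the convolutional family concrete, the sketch below shows a minimal CNN-based sentence-level relation classifier with word and entity-position embeddings, in the spirit of the classic position-embedding CNN models that such surveys cover. All class names, dimensions, and hyperparameters here are illustrative assumptions, not taken from the survey.

```python
import torch
import torch.nn as nn

class CNNRelationClassifier(nn.Module):
    """Minimal sentence-level relation classifier: word + position embeddings, CNN, max-pooling."""

    def __init__(self, vocab_size=10000, num_relations=10, word_dim=100,
                 pos_dim=20, max_len=100, num_filters=128, kernel_size=3):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim, padding_idx=0)
        # Two position embeddings: relative distance of each token to the head and tail entity.
        self.pos1_emb = nn.Embedding(2 * max_len, pos_dim)
        self.pos2_emb = nn.Embedding(2 * max_len, pos_dim)
        in_dim = word_dim + 2 * pos_dim
        self.conv = nn.Conv1d(in_dim, num_filters, kernel_size, padding=kernel_size // 2)
        self.classifier = nn.Linear(num_filters, num_relations)

    def forward(self, tokens, pos1, pos2):
        # tokens, pos1, pos2: (batch, seq_len) integer tensors
        x = torch.cat([self.word_emb(tokens),
                       self.pos1_emb(pos1),
                       self.pos2_emb(pos2)], dim=-1)       # (B, L, D)
        h = torch.relu(self.conv(x.transpose(1, 2)))       # (B, F, L)
        h = h.max(dim=2).values                            # max-pooling over time
        return self.classifier(h)                          # relation logits

# Toy usage with random ids; real inputs would be token and relative-position indices.
model = CNNRelationClassifier()
tokens = torch.randint(1, 10000, (2, 100))
pos1 = torch.randint(0, 200, (2, 100))
pos2 = torch.randint(0, 200, (2, 100))
print(model(tokens, pos1, pos2).shape)  # torch.Size([2, 10])
```

Recurrent, attention-based, and graph convolutional encoders surveyed in the paper differ mainly in how the token representations are aggregated before the final relation classifier.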
Related papers
- Maximizing Relation Extraction Potential: A Data-Centric Study to Unveil Challenges and Opportunities [3.8087810875611896]
This paper investigates the possible data-centric characteristics that impede neural relation extraction.
It emphasizes pivotal issues, such as contextual ambiguity, correlating relations, long-tail data, and fine-grained relation distributions.
It also outlines future directions for alleviating these issues, making it a useful resource for both novice and advanced researchers.
arXiv Detail & Related papers (2024-09-07T23:40:47Z)
- State-Space Modeling in Long Sequence Processing: A Survey on Recurrence in the Transformer Era [59.279784235147254]
This survey provides an in-depth summary of the latest approaches that are based on recurrent models for sequential data processing.
The emerging picture suggests that there is room for novel routes, built on learning algorithms that depart from standard Backpropagation Through Time.
arXiv Detail & Related papers (2024-06-13T12:51:22Z)
- Automatic Discovery of Visual Circuits [66.99553804855931]
We explore scalable methods for extracting the subgraph of a vision model's computational graph that underlies recognition of a specific visual concept.
We find that our approach extracts circuits that causally affect model output, and that editing these circuits can defend large pretrained models from adversarial attacks.
arXiv Detail & Related papers (2024-04-22T17:00:57Z)
- Relational Extraction on Wikipedia Tables using Convolutional and Memory Networks [6.200672130699805]
Relation extraction (RE) is the task of extracting relations between entities in text.
We introduce a new model consisting of a Convolutional Neural Network (CNN) and a Bidirectional Long Short-Term Memory (BiLSTM) network to encode entities (a rough sketch of such an encoder is given after this list).
arXiv Detail & Related papers (2023-07-11T22:36:47Z)
- Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z)
- Deep Learning Schema-based Event Extraction: Literature Review and Current Trends [60.29289298349322]
Event extraction technology based on deep learning has become a research hotspot.
This paper fills the gap by reviewing the state-of-the-art approaches, focusing on deep learning-based models.
arXiv Detail & Related papers (2021-07-05T16:32:45Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- WebRED: Effective Pretraining And Finetuning For Relation Extraction On The Web [4.702325864333419]
WebRED is a strongly-supervised human annotated dataset for extracting relationships from text found on the World Wide Web.
We show that combining pre-training on a large weakly supervised dataset with fine-tuning on a small strongly-supervised dataset leads to better relation extraction performance (see the two-stage training sketch after this list).
arXiv Detail & Related papers (2021-02-18T23:56:12Z)
- RH-Net: Improving Neural Relation Extraction via Reinforcement Learning and Hierarchical Relational Searching [2.1828601975620257]
We propose a novel framework named RH-Net, which utilizes reinforcement learning and a hierarchical relational searching module to improve relation extraction.
We then propose the hierarchical relational searching module to share the semantics from correlative instances between data-rich and data-poor classes.
arXiv Detail & Related papers (2020-10-27T12:50:27Z)
- Relation-Guided Representation Learning [53.60351496449232]
We propose a new representation learning method that explicitly models and leverages sample relations.
Our framework well preserves the relations between samples.
By seeking to embed samples into a subspace, we show that our method can address the large-scale and out-of-sample problems.
arXiv Detail & Related papers (2020-07-11T10:57:45Z) - Exploration and Discovery of the COVID-19 Literature through Semantic
Visualization [9.687961759392559]
We are developing semantic visualization techniques to enhance exploration and enable discovery over large datasets of relations.
Our hope is that this will enable the discovery of novel inferences over relations in complex data that otherwise would go unnoticed.
arXiv Detail & Related papers (2020-07-03T16:40:37Z)
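For the Wikipedia-tables entry above, which encodes entities with a CNN followed by a BiLSTM, a minimal sketch of such an encoder is shown below. Every name, dimension, and the final scoring layer are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class CNNBiLSTMEntityEncoder(nn.Module):
    """Hypothetical CNN + BiLSTM encoder producing a fixed-size vector for an entity mention."""

    def __init__(self, vocab_size=10000, emb_dim=100, cnn_filters=64,
                 kernel_size=3, lstm_hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Convolution over the mention tokens captures local n-gram features.
        self.conv = nn.Conv1d(emb_dim, cnn_filters, kernel_size, padding=kernel_size // 2)
        # A BiLSTM reads the convolved features in both directions.
        self.lstm = nn.LSTM(cnn_filters, lstm_hidden, batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, mention_len)
        x = self.emb(token_ids)                       # (B, L, E)
        c = torch.relu(self.conv(x.transpose(1, 2)))  # (B, F, L)
        out, _ = self.lstm(c.transpose(1, 2))         # (B, L, 2H)
        return out.mean(dim=1)                        # fixed-size entity vector

# Encode two candidate entities and score a relation between them (toy example).
encoder = CNNBiLSTMEntityEncoder()
head = encoder(torch.randint(1, 10000, (2, 8)))
tail = encoder(torch.randint(1, 10000, (2, 8)))
scorer = nn.Linear(4 * 64, 5)                         # 5 hypothetical relation types
print(scorer(torch.cat([head, tail], dim=-1)).shape)  # torch.Size([2, 5])
```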
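For the WebRED entry above, the pre-train-then-fine-tune recipe can be sketched as two standard training phases over different corpora. The toy model, data, and hyperparameters below are assumptions for illustration only, not WebRED's actual setup.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def run_epochs(model, loader, optimizer, epochs):
    """One training phase: standard cross-entropy over relation labels."""
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for features, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(features), labels)
            loss.backward()
            optimizer.step()

# Toy stand-ins for the two corpora: a large, noisily labeled set and a small, human-labeled set.
weak_data = TensorDataset(torch.randn(1024, 32), torch.randint(0, 5, (1024,)))
strong_data = TensorDataset(torch.randn(64, 32), torch.randint(0, 5, (64,)))

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 5))

# Stage 1: pre-train on the large weakly supervised data with a larger learning rate.
run_epochs(model, DataLoader(weak_data, batch_size=64, shuffle=True),
           torch.optim.Adam(model.parameters(), lr=1e-3), epochs=3)

# Stage 2: fine-tune on the small strongly supervised data with a smaller learning rate.
run_epochs(model, DataLoader(strong_data, batch_size=16, shuffle=True),
           torch.optim.Adam(model.parameters(), lr=1e-4), epochs=5)
```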