More Data, More Relations, More Context and More Openness: A Review and
Outlook for Relation Extraction
- URL: http://arxiv.org/abs/2004.03186v3
- Date: Wed, 30 Sep 2020 09:15:29 GMT
- Title: More Data, More Relations, More Context and More Openness: A Review and
Outlook for Relation Extraction
- Authors: Xu Han, Tianyu Gao, Yankai Lin, Hao Peng, Yaoliang Yang, Chaojun Xiao,
Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
- Abstract summary: People have been working on extracting relational facts from text for years.
With the explosion of Web text and the emergence of new relations, human knowledge is increasing drastically.
We show promising directions towards more powerful relation extraction (RE).
- Score: 106.28192084309617
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Relational facts are an important component of human knowledge, and they lie
hidden in vast amounts of text. In order to extract these facts from text,
people have been working on relation extraction (RE) for years. From early
pattern matching to current neural networks, existing RE methods have achieved
significant progress. Yet with the explosion of Web text and the emergence of new
relations, human knowledge is increasing drastically, and we thus require
"more" from RE: a more powerful RE system that can robustly utilize more data,
efficiently learn more relations, easily handle more complicated context, and
flexibly generalize to more open domains. In this paper, we look back at
existing RE methods, analyze key challenges we are facing nowadays, and show
promising directions towards more powerful RE. We hope our view can advance
this field and inspire more efforts in the community.
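To make the task concrete, below is a minimal sketch of the early pattern-matching style of RE that the abstract contrasts with current neural methods. The patterns, example sentence, and relation labels are invented here for illustration and are not taken from the surveyed paper; a neural RE system would instead encode the sentence together with the marked entity pair and classify over a fixed (or open) set of relations.

```python
import re

# Hypothetical, hand-written surface patterns mapping a sentence to
# (head, relation, tail) triples -- the classic pattern-based RE setup.
# Relation names and patterns are illustrative assumptions only.
PATTERNS = [
    (re.compile(r"(?P<head>[A-Z][\w ]+?) was born in (?P<tail>[A-Z][\w ]+)"),
     "place_of_birth"),
    (re.compile(r"(?P<head>[A-Z][\w ]+?) is the capital of (?P<tail>[A-Z][\w ]+)"),
     "capital_of"),
]

def pattern_extract(sentence: str):
    """Return all (head, relation, tail) triples matched by any pattern."""
    triples = []
    for pattern, relation in PATTERNS:
        for match in pattern.finditer(sentence):
            triples.append((match.group("head"), relation, match.group("tail")))
    return triples

if __name__ == "__main__":
    print(pattern_extract("Alan Turing was born in London."))
    # [('Alan Turing', 'place_of_birth', 'London')]
```

The brittleness of such patterns (every new phrasing needs a new rule) is exactly what motivates the data-hungry neural approaches, and the "more data, more relations, more context, more openness" challenges the survey discusses.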
Related papers
- A Comprehensive Survey on Relation Extraction: Recent Advances and New Frontiers [76.51245425667845]
Relation extraction (RE) involves identifying the relations between entities from underlying content.
Deep neural networks have dominated the field of RE and made noticeable progress.
This survey is expected to facilitate researchers' collaborative efforts to address the challenges of real-world RE systems.
arXiv Detail & Related papers (2023-06-03T08:39:25Z) - Synergistic Interplay between Search and Large Language Models for
Information Retrieval [141.18083677333848]
InteR allows retrieval models (RMs) to expand knowledge in queries using LLM-generated knowledge collections.
InteR achieves overall superior zero-shot retrieval performance compared to state-of-the-art methods.
arXiv Detail & Related papers (2023-05-12T11:58:15Z) - Enriching Relation Extraction with OpenIE [70.52564277675056]
Relation extraction (RE) is a sub-discipline of information extraction (IE).
In this work, we explore how recent approaches for open information extraction (OpenIE) may help to improve the task of RE.
Our experiments over two annotated corpora, KnowledgeNet and FewRel, demonstrate the improved accuracy of our enriched models.
arXiv Detail & Related papers (2022-12-19T11:26:23Z) - An Overview of Distant Supervision for Relation Extraction with a Focus
on Denoising and Pre-training Methods [0.0]
Relation Extraction is a foundational task of natural language processing.
The history of RE methods can be roughly organized into four phases: pattern-based RE, statistical RE, neural RE, and large language model-based RE.
arXiv Detail & Related papers (2022-07-17T21:02:04Z) - Should We Rely on Entity Mentions for Relation Extraction? Debiasing
Relation Extraction with Counterfactual Analysis [60.83756368501083]
We propose the CORE (Counterfactual Analysis based Relation Extraction) debiasing method for sentence-level relation extraction.
Our CORE method is model-agnostic to debias existing RE systems during inference without changing their training processes.
arXiv Detail & Related papers (2022-05-08T05:13:54Z) - Deep Neural Network Based Relation Extraction: An Overview [2.8436446946726552]
Relation Extraction (RE) plays a vital role in Natural Language Processing (NLP).
Its purpose is to identify semantic relations between entities from natural language text.
Deep Neural Networks (DNNs) are the most popular and reliable solutions for RE.
arXiv Detail & Related papers (2021-01-06T07:53:05Z) - Learning from Context or Names? An Empirical Study on Neural Relation
Extraction [112.06614505580501]
We study the effect of two main information sources in text: textual context and entity mentions (names).
We propose an entity-masked contrastive pre-training framework for relation extraction (RE).
Our framework can improve the effectiveness and robustness of neural models in different RE scenarios.
arXiv Detail & Related papers (2020-10-05T11:21:59Z)