Automatic Related Work Generation: A Meta Study
- URL: http://arxiv.org/abs/2201.01880v1
- Date: Thu, 6 Jan 2022 01:16:38 GMT
- Title: Automatic Related Work Generation: A Meta Study
- Authors: Xiangci Li and Jessica Ouyang
- Abstract summary: In natural language processing, a literature review is usually conducted under the "Related Work" section.
The task of automatic related work generation aims to automatically generate the "Related Work" section.
We conduct a meta-study to compare the existing literature on related work generation from the perspectives of problem formulation, dataset collection, methodological approach, performance evaluation, and future prospects.
- Score: 5.025654873456755
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Academic research is an exploration activity to solve problems that have
never been resolved before. By this nature, each academic research work is
required to perform a literature review to distinguish its novelties that have
not been addressed by prior works. In natural language processing, this
literature review is usually conducted under the "Related Work" section. The
task of automatic related work generation aims to automatically generate the
"Related Work" section given the rest of the research paper and a list of cited
papers. Although this task was proposed over 10 years ago, it received little
attention until very recently, when it was cast as a variant of the scientific
multi-document summarization problem. However, even today, the problems of
automatic related work and citation text generation are not yet standardized.
In this survey, we conduct a meta-study to compare the existing literature on
related work generation from the perspectives of problem formulation, dataset
collection, methodological approach, performance evaluation, and future
prospects to provide the reader insight into the progress of the
state-of-the-art studies, as well as how future studies can be conducted.
We also survey relevant fields of study that we suggest future work to consider
integrating.
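The abstract above formulates related work generation as producing the "Related Work" section from the rest of the paper plus its cited papers, cast as a variant of multi-document summarization. As a minimal illustrative sketch of that input/output structure (all names and the separator token here are hypothetical, not from any specific dataset or system):

```python
from dataclasses import dataclass, field

@dataclass
class RWGInstance:
    """One related work generation (RWG) example, per the task definition above."""
    target_paper: str                                   # the rest of the research paper
    cited_papers: list = field(default_factory=list)    # texts of the cited papers
    related_work: str = ""                              # gold "Related Work" section (target)

def format_as_summarization_input(instance: RWGInstance) -> str:
    """Cast RWG as multi-document summarization: concatenate the target
    paper with its cited papers, separated by a document marker."""
    docs = [instance.target_paper] + instance.cited_papers
    return "\n[DOC]\n".join(docs)

example = RWGInstance(
    target_paper="Title: Automatic Related Work Generation. Abstract: ...",
    cited_papers=["Cited paper A abstract ...", "Cited paper B abstract ..."],
)
formatted = format_as_summarization_input(example)
print(formatted.count("[DOC]"))  # one separator per cited paper
```

A real system would feed `formatted` to a summarization model and train it against `related_work`; this sketch only shows how the surveyed papers frame the inputs and target.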
Related papers
- LLAssist: Simple Tools for Automating Literature Review Using Large Language Models [0.0]
LLAssist is an open-source tool designed to streamline literature reviews in academic research.
It uses Large Language Models (LLMs) and Natural Language Processing (NLP) techniques to automate key aspects of the review process.
arXiv Detail & Related papers (2024-07-19T02:48:54Z)
- Retrieval-Enhanced Machine Learning: Synthesis and Opportunities [60.34182805429511]
Retrieval-enhancement can be extended to a broader spectrum of machine learning (ML) approaches.
This work introduces a formal framework for this paradigm, Retrieval-Enhanced Machine Learning (REML), by synthesizing the literature across various ML domains with consistent notation, which is missing from the current literature.
The goal of this work is to equip researchers across various disciplines with a comprehensive, formally structured framework of retrieval-enhanced models, thereby fostering interdisciplinary future research.
arXiv Detail & Related papers (2024-07-17T20:01:21Z)
- A Survey of Imitation Learning Methods, Environments and Metrics [9.967130899041651]
Imitation learning is an approach in which an agent learns how to execute a task by trying to mimic how one or more teachers perform it.
This learning approach offers a compromise between the time it takes to learn a new task and the effort needed to collect teacher samples for the agent.
The field of imitation learning has received much attention from researchers in recent years, resulting in many new methods and applications.
arXiv Detail & Related papers (2024-04-30T11:13:23Z)
- Related Work and Citation Text Generation: A Survey [12.039469573641217]
The demands of literature review writing make automatic related work generation (RWG) academically and computationally interesting.
RWG is an excellent test bed for examining the capability of state-of-the-art (SOTA) natural language processing (NLP) models.
Since the initial proposal of the RWG task, its popularity has waxed and waned, following the capabilities of mainstream NLP approaches.
arXiv Detail & Related papers (2024-04-17T17:37:30Z)
- Target-aware Abstractive Related Work Generation with Contrastive Learning [48.02845973891943]
The related work section is an important component of a scientific paper, which highlights the contribution of the target paper in the context of the reference papers.
Most of the existing related work section generation methods rely on extracting off-the-shelf sentences.
We propose an abstractive target-aware related work generator (TAG), which can generate related work sections consisting of new sentences.
arXiv Detail & Related papers (2022-05-26T13:20:51Z)
- CORWA: A Citation-Oriented Related Work Annotation Dataset [4.740962650068886]
In natural language processing, literature reviews are usually conducted under the "Related Work" section.
We train a strong baseline model that automatically tags the CORWA labels on massive unlabeled related work section texts.
We suggest a novel framework for human-in-the-loop, iterative, abstractive related work generation.
arXiv Detail & Related papers (2022-05-07T00:23:46Z)
- A Survey on Retrieval-Augmented Text Generation [53.04991859796971]
Retrieval-augmented text generation has remarkable advantages and has achieved state-of-the-art performance in many NLP tasks.
It first highlights the generic paradigm of retrieval-augmented generation, then reviews notable approaches for different tasks.
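The generic retrieval-augmented paradigm mentioned above can be sketched as a retrieve-then-generate pipeline. The word-overlap scorer and string-assembling "generator" below are toy stand-ins for a real retriever and sequence-to-sequence model; all names are illustrative assumptions:

```python
def retrieve(query: str, corpus: list, k: int = 2) -> list:
    """Toy retriever: rank corpus entries by word overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda doc: len(q & set(doc.lower().split())),
                    reverse=True)
    return scored[:k]

def generate(query: str, retrieved: list) -> str:
    """Toy generator: a real system would condition a language model on the
    query plus retrieved texts; here we only assemble that conditioned input."""
    context = " | ".join(retrieved)
    return f"answer({query}; context={context})"

corpus = [
    "retrieval augmented generation conditions output on external text",
    "imitation learning mimics a teacher",
    "text summarization creates a condensed version of a document",
]
top = retrieve("what is retrieval augmented generation", corpus)
print(generate("what is retrieval augmented generation", top))
```

The key design point the surveyed paradigm captures is that generation is conditioned on retrieved evidence rather than on the query alone.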
arXiv Detail & Related papers (2022-02-02T16:18:41Z)
- Positioning yourself in the maze of Neural Text Generation: A Task-Agnostic Survey [54.34370423151014]
This paper surveys the components of modeling approaches and their task-level impacts across various generation tasks such as storytelling, summarization, and translation.
We present an abstraction of the imperative techniques with respect to learning paradigms, pretraining, modeling approaches, decoding and the key challenges outstanding in the field in each of them.
arXiv Detail & Related papers (2020-10-14T17:54:42Z)
- From Standard Summarization to New Tasks and Beyond: Summarization with Manifold Information [77.89755281215079]
Text summarization is the research area that aims to create a short, condensed version of the original document.
In real-world applications, most of the data is not in a plain text format.
This paper surveys these new summarization tasks and approaches as they arise in real-world applications.
arXiv Detail & Related papers (2020-05-10T14:59:36Z)
- Explaining Relationships Between Scientific Documents [55.23390424044378]
We address the task of explaining relationships between two scientific documents using natural language text.
In this paper we establish a dataset of 622K examples from 154K documents.
arXiv Detail & Related papers (2020-02-02T03:54:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.