What's New? Summarizing Contributions in Scientific Literature
- URL: http://arxiv.org/abs/2011.03161v2
- Date: Mon, 9 Nov 2020 16:16:45 GMT
- Title: What's New? Summarizing Contributions in Scientific Literature
- Authors: Hiroaki Hayashi, Wojciech Kryściński, Bryan McCann, Nazneen Rajani, Caiming Xiong
- Abstract summary: We introduce a new task of disentangled paper summarization, which seeks to generate separate summaries for the paper contributions and the context of the work.
We extend the S2ORC corpus of academic articles by adding disentangled "contribution" and "context" reference labels.
We propose a comprehensive automatic evaluation protocol which reports the relevance, novelty, and disentanglement of generated outputs.
- Score: 85.95906677964815
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With thousands of academic articles shared on a daily basis, it has become
increasingly difficult to keep up with the latest scientific findings. To
overcome this problem, we introduce a new task of disentangled paper
summarization, which seeks to generate separate summaries for the paper
contributions and the context of the work, making it easier to identify the key
findings shared in articles. For this purpose, we extend the S2ORC corpus of
academic articles, which spans a diverse set of domains ranging from economics
to psychology, by adding disentangled "contribution" and "context" reference
labels. Together with the dataset, we introduce and analyze three baseline
approaches: 1) a unified model controlled by input code prefixes, 2) a model
with separate generation heads specialized in generating the disentangled
outputs, and 3) a training strategy that guides the model using additional
supervision coming from inbound and outbound citations. We also propose a
comprehensive automatic evaluation protocol which reports the relevance,
novelty, and disentanglement of generated outputs. Through a human study
involving expert annotators, we show that in 79% of cases our new task is
considered more helpful than traditional scientific paper summarization.
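The first baseline, a unified model controlled by input code prefixes, follows the general control-code pattern: the same model is steered toward either summary type by a token prepended to the source. A minimal sketch of that pattern, assuming a trained seq2seq summarizer; the control-token names `<|contribution|>` and `<|context|>` are illustrative assumptions, not the paper's actual vocabulary:

```python
# Sketch of baseline 1: a single ("unified") summarizer steered by a control
# code prepended to the input document. Token names are hypothetical.

CONTROL_CODES = {"contribution": "<|contribution|>", "context": "<|context|>"}


def build_input(document: str, target: str) -> str:
    """Prefix the document with the control code for the desired summary type."""
    return f"{CONTROL_CODES[target]} {document}"


def disentangled_summaries(document: str, summarize) -> dict:
    """Run the same model once per control code to get both summaries."""
    return {t: summarize(build_input(document, t)) for t in CONTROL_CODES}


def toy_summarize(prefixed_input: str) -> str:
    """Toy stand-in for a trained seq2seq model: reports the requested mode."""
    code = prefixed_input.split(" ", 1)[0]
    return f"summary generated under {code}"
```

In practice `toy_summarize` would be replaced by a fine-tuned encoder-decoder model whose training pairs each carry the control code matching their reference label, so a single set of weights learns both behaviors.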
Related papers
- P^3SUM: Preserving Author's Perspective in News Summarization with Diffusion Language Models [57.571395694391654]
We find that existing approaches alter the political opinions and stances of news articles in more than 50% of summaries.
We propose P^3SUM, a diffusion model-based summarization approach controlled by political perspective classifiers.
Experiments on three news summarization datasets demonstrate that P^3SUM outperforms state-of-the-art summarization systems.
arXiv Detail & Related papers (2023-11-16T10:14:28Z)
- NLPeer: A Unified Resource for the Computational Study of Peer Review [58.71736531356398]
We introduce NLPeer -- the first ethically sourced multidomain corpus of more than 5k papers and 11k review reports from five different venues.
We augment previous peer review datasets to include parsed and structured paper representations, rich metadata and versioning information.
Our work paves the path towards systematic, multi-faceted, evidence-based study of peer review in NLP and beyond.
arXiv Detail & Related papers (2022-11-12T12:29:38Z)
- Target-aware Abstractive Related Work Generation with Contrastive Learning [48.02845973891943]
The related work section is an important component of a scientific paper, which highlights the contribution of the target paper in the context of the reference papers.
Most of the existing related work section generation methods rely on extracting off-the-shelf sentences.
We propose an abstractive target-aware related work generator (TAG), which can generate related work sections consisting of new sentences.
arXiv Detail & Related papers (2022-05-26T13:20:51Z)
- Revise and Resubmit: An Intertextual Model of Text-based Collaboration in Peer Review [52.359007622096684]
Peer review is a key component of the publishing process in most fields of science.
Existing NLP studies focus on the analysis of individual texts.
Editorial assistance, however, often requires modeling interactions between pairs of texts.
arXiv Detail & Related papers (2022-04-22T16:39:38Z)
- Enhancing Identification of Structure Function of Academic Articles Using Contextual Information [6.28532577139029]
This paper takes articles of the ACL conference as the corpus to identify the structure function of academic articles.
We employ traditional machine learning models and deep learning models to construct classifiers based on various feature inputs.
Inspired by (2), this paper introduces contextual information into the deep learning models and achieves significant improvements.
arXiv Detail & Related papers (2021-11-28T11:21:21Z)
- Enhancing Scientific Papers Summarization with Citation Graph [78.65955304229863]
We redefine the task of scientific papers summarization by utilizing their citation graph.
We construct a novel scientific papers summarization dataset Semantic Scholar Network (SSN) which contains 141K research papers in different domains.
Our model achieves competitive performance compared with pretrained models.
arXiv Detail & Related papers (2021-04-07T11:13:35Z)
- CORAL: COde RepresentAtion Learning with Weakly-Supervised Transformers for Analyzing Data Analysis [33.190021245507445]
Large scale analysis of source code, and in particular scientific source code, holds the promise of better understanding the data science process.
We propose a novel weakly supervised transformer-based architecture for computing joint representations of code from both abstract syntax trees and surrounding natural language comments.
We show that our model, leveraging only easily-available weak supervision, achieves a 38% increase in accuracy over expert-supplied heuristics and outperforms a suite of baselines.
arXiv Detail & Related papers (2020-08-28T19:57:49Z)
- Two Huge Title and Keyword Generation Corpora of Research Articles [0.0]
We introduce two huge datasets for text summarization (OAGSX) and keyword generation (OAGKX) research.
The data were retrieved from the Open Academic Graph which is a network of research profiles and publications.
We would like to apply topic modeling on the two sets to derive subsets of research articles from more specific disciplines.
arXiv Detail & Related papers (2020-02-11T21:17:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences arising from its use.