How Do We Engage with Other Disciplines? A Framework to Study Meaningful Interdisciplinary Discourse in Scholarly Publications
- URL: http://arxiv.org/abs/2601.17020v1
- Date: Thu, 15 Jan 2026 23:16:49 GMT
- Title: How Do We Engage with Other Disciplines? A Framework to Study Meaningful Interdisciplinary Discourse in Scholarly Publications
- Authors: Bagyasree Sudharsan, Alexandria Leto, Maria Leonor Pacheco
- Abstract summary: We propose a framework for the evaluation of citation engagement in interdisciplinary Natural Language Processing (NLP) publications. Our approach introduces a citation purpose taxonomy tailored to interdisciplinary work, supported by an annotation study. We demonstrate the utility of this framework through a thorough analysis of publications at the intersection of NLP and Computational Social Science.
- Score: 48.273878997257924
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the rising popularity of interdisciplinary work and increasing institutional incentives in this direction, there is a growing need to understand how resulting publications incorporate ideas from multiple disciplines. Existing computational approaches, such as affiliation diversity, keywords, and citation patterns, do not account for how individual citations are used to advance the citing work. Although prior studies have addressed part of this gap by proposing taxonomies to classify citation purpose, these frameworks are not well suited to interdisciplinary research and do not provide quantitative measures of citation engagement quality. To address these limitations, we propose a framework for the evaluation of citation engagement in interdisciplinary Natural Language Processing (NLP) publications. Our approach introduces a citation purpose taxonomy tailored to interdisciplinary work, supported by an annotation study. We demonstrate the utility of this framework through a thorough analysis of publications at the intersection of NLP and Computational Social Science.
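As an illustration of the kind of quantitative engagement measure the abstract describes, the sketch below tags citations with purpose labels and computes the share of substantive engagement. The category names and the scoring rule are hypothetical stand-ins, not taken from the paper's actual taxonomy:

```python
from enum import Enum
from collections import Counter

# Hypothetical citation-purpose labels; the paper's actual taxonomy is not
# reproduced in the abstract, so these categories are illustrative only.
class CitationPurpose(Enum):
    BACKGROUND = "background"   # cited for context only
    METHOD_USE = "method_use"   # cited work's method is applied
    COMPARISON = "comparison"   # cited work is compared against
    CRITIQUE = "critique"       # cited work is challenged or extended

def engagement_profile(labels):
    """Summarize how a paper engages with its citations: the share of
    'deep' purposes (method use, comparison, critique) vs. background."""
    counts = Counter(labels)
    total = sum(counts.values())
    deep = total - counts[CitationPurpose.BACKGROUND]
    return {
        "counts": {p.value: counts[p] for p in CitationPurpose},
        "deep_engagement_ratio": deep / total if total else 0.0,
    }

labels = [CitationPurpose.BACKGROUND, CitationPurpose.BACKGROUND,
          CitationPurpose.METHOD_USE, CitationPurpose.CRITIQUE]
profile = engagement_profile(labels)
```

In practice the labels would come from annotators or a trained classifier; the point of the sketch is only that a purpose taxonomy turns citation lists into a quantitative engagement signal.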
Related papers
- SECite: Analyzing and Summarizing Citations in Software Engineering Literature [0.13999481573773073]
SECite is a novel approach for evaluating scholarly impact through sentiment analysis of citation contexts. We develop a semi-automated pipeline to extract citations referencing nine research papers. We apply advanced natural language processing (NLP) techniques with unsupervised machine learning to classify these citation statements as positive or negative.
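The classification step of such a pipeline can be illustrated with a deliberately simplified lexicon-based stand-in (SECite itself uses unsupervised machine learning; the word lists here are invented purely to show the input/output shape):

```python
# Tiny hand-written polarity lexicons -- illustrative only, not SECite's method.
POSITIVE = {"effective", "improves", "state-of-the-art", "robust"}
NEGATIVE = {"fails", "limited", "inaccurate", "outperformed"}

def classify_citation(context: str) -> str:
    """Label a citation context as positive, negative, or neutral by
    counting polarity-lexicon hits among its tokens."""
    tokens = {t.strip(".,").lower() for t in context.split()}
    pos = len(tokens & POSITIVE)
    neg = len(tokens & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

label = classify_citation("Their approach improves accuracy and is robust.")
```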
arXiv Detail & Related papers (2026-01-12T19:10:01Z)
- Context-Aware Hierarchical Taxonomy Generation for Scientific Papers via LLM-Guided Multi-Aspect Clustering [59.54662810933882]
Existing taxonomy construction methods, leveraging unsupervised clustering or direct prompting of large language models, often lack coherence and granularity. We propose a novel context-aware hierarchical taxonomy generation framework that integrates LLM-guided multi-aspect encoding with dynamic clustering.
arXiv Detail & Related papers (2025-09-23T15:12:58Z)
- In-depth Research Impact Summarization through Fine-Grained Temporal Citation Analysis [52.42612945266194]
We propose a new task: generating nuanced, expressive, and time-aware impact summaries. We show that these summaries capture both praise (confirmation citations) and critique (correction citations) through the evolution of fine-grained citation intents.
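Aggregating fine-grained citation intents over time, as described above, might look like the following sketch (the intent labels and records are hypothetical examples, not data from the paper):

```python
from collections import defaultdict

# Hypothetical per-citation records: the year a citation appeared and a
# fine-grained intent label (confirmation = praise, correction = critique).
citations = [
    {"year": 2021, "intent": "confirmation"},
    {"year": 2021, "intent": "correction"},
    {"year": 2022, "intent": "confirmation"},
]

def intents_over_time(cites):
    """Build a year-by-year tally of citation intents, the raw material
    for a time-aware impact summary."""
    timeline = defaultdict(lambda: {"confirmation": 0, "correction": 0})
    for c in cites:
        timeline[c["year"]][c["intent"]] += 1
    return dict(timeline)

timeline = intents_over_time(citations)
```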
arXiv Detail & Related papers (2025-05-20T19:11:06Z)
- Delineating Feminist Studies through bibliometric analysis [1.1060425537315088]
This paper proposes a novel approach for identifying gender/sex related publications scattered across diverse scientific disciplines. We employ bibliometric techniques, natural language processing (NLP) and manual curation to compile a dataset of scientific publications. The resulting dataset comprises over 1.9 million scientific documents published between 1668 and 2023, spanning four languages.
arXiv Detail & Related papers (2024-11-27T12:52:51Z)
- The Nature of NLP: Analyzing Contributions in NLP Papers [77.31665252336157]
We propose a taxonomy of research contributions and introduce NLPContributions, a dataset of nearly $2k$ NLP research paper abstracts. We show that NLP research has taken a winding path -- with the focus on language and human-centric studies being prominent in the 1970s and 80s, tapering off in the 1990s and 2000s, and starting to rise again since the late 2010s. Our dataset and analyses offer a powerful lens for tracing research trends and potential for generating informed, data-driven literature surveys.
arXiv Detail & Related papers (2024-09-29T01:29:28Z)
- Investigating Fairness Disparities in Peer Review: A Language Model Enhanced Approach [77.61131357420201]
We conduct a thorough and rigorous study on fairness disparities in peer review with the help of large language models (LMs).
We collect, assemble, and maintain a comprehensive relational database for the International Conference on Learning Representations (ICLR) conference from 2017 to date.
We postulate and study fairness disparities on multiple protective attributes of interest, including author gender, geography, and author and institutional prestige.
arXiv Detail & Related papers (2022-11-07T16:19:42Z)
- Hierarchical Interdisciplinary Topic Detection Model for Research Proposal Classification [33.06389455749012]
We develop a deep Hierarchical Interdisciplinary Research Proposal Classification Network (HIRPCN).
We first propose a hierarchical transformer to extract the textual semantic information of proposals.
We then design an interdisciplinary graph and leverage GNNs for learning representations of each discipline.
arXiv Detail & Related papers (2022-09-16T16:59:25Z)
- Revise and Resubmit: An Intertextual Model of Text-based Collaboration in Peer Review [52.359007622096684]
Peer review is a key component of the publishing process in most fields of science.
Existing NLP studies focus on the analysis of individual texts, whereas editorial assistance often requires modeling interactions between pairs of texts.
arXiv Detail & Related papers (2022-04-22T16:39:38Z)
- Do open citations give insights on the qualitative peer-review evaluation in research assessments? An analysis of the Italian National Scientific Qualification [1.911678487931003]
The Italian National Scientific Qualification (NSQ) aims at deciding whether a scholar can apply to professorial academic positions.
It makes use of bibliometrics followed by a peer-review process of candidates' CVs.
We explore whether citation-based metrics, calculated considering only open citation data, can support the human peer review of NDs.
arXiv Detail & Related papers (2021-03-14T14:44:45Z)
- Characterizing References from Different Disciplines: A Perspective of Citation Content Analysis [7.171503036026183]
This work takes articles in PLoS as the data and characterizes the references from different disciplines based on Citation Content Analysis (CCA).
Although most references come from Natural Science, Humanities and Social Sciences play important roles in the Introduction and Background sections of the articles.
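A minimal tally in the spirit of CCA, counting which disciplines are cited in which section of an article, could look like the sketch below (the records and field names are illustrative, not drawn from the PLoS data used in the paper):

```python
from collections import defaultdict

# Hypothetical reference records: the section where each citation appears
# and the broad discipline of the cited work.
references = [
    {"section": "Introduction", "discipline": "Social Sciences"},
    {"section": "Introduction", "discipline": "Natural Science"},
    {"section": "Methods",      "discipline": "Natural Science"},
    {"section": "Methods",      "discipline": "Natural Science"},
]

def references_by_section(refs):
    """Cross-tabulate cited disciplines against article sections."""
    table = defaultdict(lambda: defaultdict(int))
    for r in refs:
        table[r["section"]][r["discipline"]] += 1
    return {section: dict(counts) for section, counts in table.items()}

dist = references_by_section(references)
```

Such a cross-tabulation is what makes findings like the one above visible: the discipline mix differs by section rather than being uniform across the article.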
arXiv Detail & Related papers (2021-01-19T13:30:00Z)
- What's New? Summarizing Contributions in Scientific Literature [85.95906677964815]
We introduce a new task of disentangled paper summarization, which seeks to generate separate summaries for the paper contributions and the context of the work.
We extend the S2ORC corpus of academic articles by adding disentangled "contribution" and "context" reference labels.
We propose a comprehensive automatic evaluation protocol which reports the relevance, novelty, and disentanglement of generated outputs.
arXiv Detail & Related papers (2020-11-06T02:23:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.