Can we cite Wikipedia? What if Wikipedia was more reliable than its detractors?
- URL: http://arxiv.org/abs/2509.02462v1
- Date: Tue, 02 Sep 2025 16:16:24 GMT
- Title: Can we cite Wikipedia? What if Wikipedia was more reliable than its detractors?
- Authors: Mohamed El Louadi
- Abstract summary: This manuscript examines the systematic rejection of Wikipedia in academic settings, not to argue for its legitimacy as a source, but to demonstrate that its reliability is often underestimated while traditional academic sources enjoy disproportionate credibility.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Wikipedia, a widely successful encyclopedia recognized in academic circles and used by both students and professors alike, has led educators to question whether it can be cited as an information source, given its widespread use for this very purpose. The dilemma quickly emerged: if Wikipedia has become the go-to information source for so many, why can't it be cited? If consulting and using Wikipedia as a source of information is permitted, why does it become controversial the moment one attempts to cite it? This manuscript examines the systematic rejection of Wikipedia in academic settings, not to argue for its legitimacy as a source, but to demonstrate that its reliability is often underestimated while traditional academic sources enjoy disproportionate credibility, despite their increasingly apparent shortcomings. The central thesis posits that Wikipedia's rejection stems from an outdated epistemological bias that overlooks both the project's verification mechanisms and the structural crises affecting scientific publishing.
Related papers
- Wiki Live Challenge: Challenging Deep Research Agents with Expert-Level Wikipedia Articles [56.724847946825285]
We introduce Wiki Live Challenge (WLC), a live benchmark that leverages the newest Wikipedia Good Articles (GAs) as expert-level references.
We propose Wiki Eval, a comprehensive evaluation framework comprising a fine-grained evaluation method with 39 criteria for writing quality and rigorous metrics for factual verifiability.
arXiv Detail & Related papers (2026-02-02T03:30:13Z) - How Grounded is Wikipedia? A Study on Structured Evidential Support [27.55382517488165]
We show that roughly 20% of claims in Wikipedia *lead* sections are unsupported by the article body.
We also show that recovery of complex grounding evidence for claims that *are* supported remains a challenge for standard retrieval methods.
arXiv Detail & Related papers (2025-06-14T21:40:14Z) - Web2Wiki: Characterizing Wikipedia Linking Across the Web [19.00204665059246]
We identify over 90 million Wikipedia links spanning 1.68% of Web domains.
Wikipedia is most frequently cited by news and science websites for informational purposes.
Most links serve as explanatory references rather than as evidence or attribution.
arXiv Detail & Related papers (2025-05-17T00:52:24Z) - Can Community Notes Replace Professional Fact-Checkers? [49.5332225129956]
Policy changes by Twitter/X and Meta signal a shift away from partnerships with fact-checking organisations.
Our analysis reveals that community notes cite fact-checking sources up to five times more than previously reported.
Our results show that successful community moderation relies on professional fact-checking and highlight how citizen and professional fact-checking are deeply intertwined.
arXiv Detail & Related papers (2025-02-19T22:26:39Z) - Forgotten Knowledge: Examining the Citational Amnesia in NLP [63.13508571014673]
We ask how far back in time we tend to go to cite papers, how that has changed over time, and what factors correlate with this citational attention/amnesia.
We show that around 62% of cited papers are from the immediate five years prior to publication, whereas only about 17% are more than ten years old.
We show that the median age and age diversity of cited papers were steadily increasing from 1990 to 2014, but since then, the trend has reversed, and current NLP papers have an all-time low temporal citation diversity.
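The citation-age statistics summarized above (share of citations at most five years old, share older than ten, median age) are straightforward to compute. A minimal sketch, assuming a hypothetical list of (citing_year, cited_year) pairs rather than any dataset from the paper:

```python
# Illustrative only: the citation pairs below are made up, not from the paper.
from statistics import median

# (citing_year, cited_year) pairs for a hypothetical bibliography
citations = [(2023, 2021), (2023, 2019), (2023, 2010),
             (2022, 2020), (2022, 2005), (2023, 2022)]

# Age of each cited paper at citation time
ages = [citing - cited for citing, cited in citations]

# Share of cited papers from the five years before publication
recent = sum(1 for a in ages if a <= 5) / len(ages)
# Share of cited papers more than ten years old
old = sum(1 for a in ages if a > 10) / len(ages)

print(f"median age: {median(ages)}")
print(f"<=5 years: {recent:.0%}, >10 years: {old:.0%}")
```

The same three numbers computed per publication year would reproduce the kind of temporal-diversity trend the paper tracks from 1990 onward.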
arXiv Detail & Related papers (2023-05-29T18:30:34Z) - Improving Wikipedia Verifiability with AI [116.69749668874493]
We develop a neural network based system, called Side, to identify Wikipedia citations that are unlikely to support their claims.
Our first citation recommendation collects over 60% more preferences than existing Wikipedia citations for the same top 10% most likely unverifiable claims.
Our results indicate that an AI-based system could be used, in tandem with humans, to improve the verifiability of Wikipedia.
arXiv Detail & Related papers (2022-07-08T15:23:29Z) - Surfer100: Generating Surveys From Web Resources on Wikipedia-style [49.23675182917996]
We show that recent advances in pretrained language modeling can be combined for a two-stage extractive and abstractive approach for Wikipedia lead paragraph generation.
We extend this approach to generate longer Wikipedia-style summaries with sections and examine how such methods struggle in this application through detailed studies with 100 reference human-collected surveys.
arXiv Detail & Related papers (2021-12-13T02:18:01Z) - A Map of Science in Wikipedia [0.22843885788439797]
We map the relationship between Wikipedia articles and scientific journal articles.
Most journal articles cited from Wikipedia belong to STEM fields, in particular biology and medicine.
Wikipedia's biographies play an important role in connecting STEM fields with the humanities, especially history.
arXiv Detail & Related papers (2021-10-26T15:44:32Z) - Multiple Texts as a Limiting Factor in Online Learning: Quantifying (Dis-)similarities of Knowledge Networks across Languages [60.00219873112454]
We investigate the hypothesis that the extent to which one obtains information on a given topic through Wikipedia depends on the language in which it is consulted.
Since Wikipedia is a central part of the web-based information landscape, this indicates a language-related, linguistic bias.
The article builds a bridge between reading research, educational science, Wikipedia research and computational linguistics.
arXiv Detail & Related papers (2020-08-05T11:11:55Z) - Quantifying Engagement with Citations on Wikipedia [13.703047949952852]
One in 300 page views results in a reference click.
Clicks occur more frequently on shorter pages and on pages of lower quality.
Recent content, open access sources and references about life events are particularly popular.
arXiv Detail & Related papers (2020-01-23T15:52:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.