Automated Fact-Checking for Assisting Human Fact-Checkers
- URL: http://arxiv.org/abs/2103.07769v1
- Date: Sat, 13 Mar 2021 18:29:14 GMT
- Title: Automated Fact-Checking for Assisting Human Fact-Checkers
- Authors: Preslav Nakov, David Corney, Maram Hasanain, Firoj Alam, Tamer
Elsayed, Alberto Barrón-Cedeño, Paolo Papotti, Shaden Shaar, Giovanni Da
San Martino
- Abstract summary: The misuse of media to spread inaccurate or misleading claims has led to the modern incarnation of the fact-checker.
We survey the available intelligent technologies that can support the human expert in the different steps of her fact-checking endeavor.
In each case, we pay attention to the challenges in future work and the potential impact on real-world fact-checking.
- Score: 29.53133956640405
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The reporting and analysis of current events around the globe have expanded
from professional, editor-led journalism all the way to citizen journalism.
Politicians and other key players enjoy direct access to their audiences
through social media, bypassing the filters of official cables or traditional
media. However, the multiple advantages of free speech and direct communication
are dimmed by the misuse of the media to spread inaccurate or misleading
claims. These phenomena have led to the modern incarnation of the fact-checker
-- a professional whose main aim is to examine claims using available evidence
to assess their veracity. As in other text forensics tasks, the amount of
information available makes the work of the fact-checker more difficult. With
this in mind, starting from the perspective of the professional fact-checker,
we survey the available intelligent technologies that can support the human
expert in the different steps of her fact-checking endeavor. These include
identifying claims worth fact-checking; detecting relevant previously
fact-checked claims; retrieving relevant evidence to fact-check a claim; and
actually verifying a claim. In each case, we pay attention to the challenges in
future work and the potential impact on real-world fact-checking.
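The abstract describes a four-step pipeline: identifying check-worthy claims, detecting previously fact-checked claims, retrieving evidence, and verifying. As a minimal, hypothetical illustration of the second step, the sketch below matches an incoming claim against a toy database of previously fact-checked claims using bag-of-words cosine similarity; the claim database, function names, and threshold are assumptions for illustration, not details from the survey.

```python
import math
from collections import Counter

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def tokenize(text: str) -> Counter:
    return Counter(text.lower().split())

# Toy database of previously fact-checked claims (hypothetical).
fact_checked = [
    "the earth is flat",
    "vaccines cause autism",
    "drinking water prevents all diseases",
]

def match_claim(claim: str, threshold: float = 0.5):
    """Return the best-matching previously fact-checked claim, or None."""
    vec = tokenize(claim)
    scored = [(cosine_similarity(vec, tokenize(c)), c) for c in fact_checked]
    best_score, best = max(scored)
    return best if best_score >= threshold else None

print(match_claim("Vaccines cause autism in children"))
```

Production systems described in the literature typically replace the bag-of-words representation with dense sentence embeddings and a learned ranking model, but the retrieve-then-threshold structure is the same.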
Related papers
- Grounding Fallacies Misrepresenting Scientific Publications in Evidence [84.32990746227385]
We introduce MissciPlus, an extension of the fallacy detection dataset Missci.
MissciPlus builds on Missci by grounding the applied fallacies in real-world passages from misrepresented studies.
MissciPlus is the first logical fallacy dataset which pairs the real-world misrepresented evidence with incorrect claims.
arXiv Detail & Related papers (2024-08-23T03:16:26Z)
- ManiTweet: A New Benchmark for Identifying Manipulation of News on Social Media [74.93847489218008]
We present a novel task, identifying manipulation of news on social media, which aims to detect manipulation in social media posts and identify manipulated or inserted information.
To study this task, we have proposed a data collection schema and curated a dataset called ManiTweet, consisting of 3.6K pairs of tweets and corresponding articles.
Our analysis demonstrates that this task is highly challenging, with large language models (LLMs) yielding unsatisfactory performance.
arXiv Detail & Related papers (2023-05-23T16:40:07Z)
- Unveiling the Hidden Agenda: Biases in News Reporting and Consumption [59.55900146668931]
We build a six-year dataset on the Italian vaccine debate and adopt a Bayesian latent space model to identify narrative and selection biases.
We found a nonlinear relationship between biases and engagement, with higher engagement for extreme positions.
Analysis of news consumption on Twitter reveals common audiences among news outlets with similar ideological positions.
arXiv Detail & Related papers (2023-01-14T18:58:42Z)
- Missing Counter-Evidence Renders NLP Fact-Checking Unrealistic for Misinformation [67.69725605939315]
Misinformation emerges in times of uncertainty when credible information is limited.
This is challenging for NLP-based fact-checking as it relies on counter-evidence, which may not yet be available.
arXiv Detail & Related papers (2022-10-25T09:40:48Z)
- FaVIQ: FAct Verification from Information-seeking Questions [77.7067957445298]
We construct a large-scale fact verification dataset called FaVIQ using information-seeking questions posed by real users.
Our claims are verified to be natural, contain little lexical bias, and require a complete understanding of the evidence for verification.
arXiv Detail & Related papers (2021-07-05T17:31:44Z)
- A Survey on Predicting the Factuality and the Bias of News Media [29.032850263311342]
"The state of the art on media profiling for factuality and bias"
"Political bias detection, which in the Western political landscape is about predicting left-center-right bias"
"Recent advances in using different information sources and modalities"
arXiv Detail & Related papers (2021-03-16T11:11:54Z)
- The Role of the Crowd in Countering Misinformation: A Case Study of the COVID-19 Infodemic [15.885290526721544]
We focus on tweets related to the COVID-19 pandemic, analyzing the spread of misinformation, professional fact checks, and the crowd response to popular misleading claims about COVID-19.
We train a classifier to create a novel dataset of 155,468 COVID-19-related tweets, containing 33,237 false claims and 33,413 refuting arguments.
We observe that the surge in misinformation tweets results in a quick response and a corresponding increase in tweets that refute such misinformation.
arXiv Detail & Related papers (2020-11-11T13:48:44Z)
- Explainable Automated Fact-Checking for Public Health Claims [11.529816799331979]
We present the first study of explainable fact-checking for claims which require specific expertise.
For our case study we choose the setting of public health.
We explore two tasks: veracity prediction and explanation generation.
arXiv Detail & Related papers (2020-10-19T23:51:33Z)
- That is a Known Lie: Detecting Previously Fact-Checked Claims [34.30218503006579]
A large number of fact-checked claims have accumulated.
Politicians like to repeat their favorite statements, true or false, over and over again.
It is therefore important to avoid wasting effort on claims that have already been fact-checked.
arXiv Detail & Related papers (2020-05-12T21:25:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.