Evidence-based Factual Error Correction
- URL: http://arxiv.org/abs/2106.01072v1
- Date: Wed, 2 Jun 2021 11:00:17 GMT
- Title: Evidence-based Factual Error Correction
- Authors: James Thorne, Andreas Vlachos
- Abstract summary: This paper introduces the task of factual error correction.
It provides a mechanism to correct written texts that are refuted or only partially supported by evidence.
- Score: 18.52583883901634
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces the task of factual error correction: performing edits
to a claim so that the generated rewrite is better supported by evidence. This
extends the well-studied task of fact verification by providing a mechanism to
correct written texts that are refuted or only partially supported by evidence.
We demonstrate that it is feasible to train factual error correction systems
from existing fact checking datasets which only contain labeled claims
accompanied by evidence, but not the correction. We achieve this by employing a
two-stage distant supervision approach that incorporates evidence into masked
claims when generating corrections. Our approach, based on the T5 transformer
and using retrieved evidence, achieved better results than existing work which
used a pointer copy network and gold evidence, producing accurate factual error
corrections for 5x more instances in human evaluation and a .125 increase in
SARI score. The evaluation is conducted on a dataset of 65,000 instances based
on a recent fact verification shared task and we release it to enable further
work on the task.
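To make the two-stage approach concrete, here is a minimal sketch assuming a HuggingFace "t5-base" checkpoint. The heuristic masker (masking claim tokens absent from the evidence) and the prompt format are illustrative stand-ins for the paper's distantly supervised components, not the authors' implementation.
```python
# Minimal sketch of the mask-then-correct idea. Assumption: a heuristic
# masker; the paper trains its masker with distant supervision instead.
from transformers import T5ForConditionalGeneration, T5TokenizerFast

tokenizer = T5TokenizerFast.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def mask_claim(claim: str, evidence: str) -> str:
    """Stage 1: mask claim tokens not supported by the evidence, using
    T5 sentinel tokens so the corrector can in-fill the masked spans."""
    vocab = set(evidence.lower().split())
    out, sentinel, prev_masked = [], 0, False
    for tok in claim.split():
        if tok.lower().strip(".,") in vocab:
            out.append(tok)
            prev_masked = False
        else:
            if not prev_masked:  # merge adjacent masks into one span
                out.append(f"<extra_id_{sentinel}>")
                sentinel += 1
            prev_masked = True
    return " ".join(out)

def correct(claim: str, evidence: str) -> str:
    """Stage 2: condition generation on the evidence plus the masked claim.
    A corrector fine-tuned as in the paper would emit the corrected claim;
    the raw t5-base checkpoint only emits text for the masked spans."""
    prompt = f"evidence: {evidence} claim: {mask_claim(claim, evidence)}"
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    return tokenizer.decode(model.generate(ids, max_new_tokens=32)[0],
                            skip_special_tokens=True)

print(correct("The Eiffel Tower is located in Berlin.",
              "The Eiffel Tower is a wrought-iron tower in Paris, France."))
```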
Related papers
- Contrastive Learning to Improve Retrieval for Real-world Fact Checking [84.57583869042791]
We present Contrastive Fact-Checking Reranker (CFR), an improved retriever for fact-checking complex claims.
We leverage the AVeriTeC dataset, which annotates subquestions for claims with human written answers from evidence documents.
We find a 6% improvement in veracity classification accuracy on the dataset.
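As a rough illustration of contrastive retriever training in this spirit (not the authors' exact objective), the sketch below uses an InfoNCE loss with in-batch negatives; the random tensors stand in for claim and evidence encoder outputs.
```python
# Minimal sketch of contrastive retrieval training with in-batch negatives.
# CFR's actual model and its AVeriTeC-derived supervision are richer.
import torch
import torch.nn.functional as F

def info_nce(claim_emb: torch.Tensor, evidence_emb: torch.Tensor,
             temperature: float = 0.05) -> torch.Tensor:
    """Each claim's matching evidence row is the positive; every other
    row in the batch acts as a negative."""
    claim_emb = F.normalize(claim_emb, dim=-1)
    evidence_emb = F.normalize(evidence_emb, dim=-1)
    logits = claim_emb @ evidence_emb.T / temperature  # (B, B) similarities
    labels = torch.arange(logits.size(0))              # diagonal = positives
    return F.cross_entropy(logits, labels)

claims = torch.randn(8, 128, requires_grad=True)   # stand-in embeddings
evidence = torch.randn(8, 128, requires_grad=True)
loss = info_nce(claims, evidence)
loss.backward()  # in real training, gradients flow into both encoders
print(f"contrastive loss: {loss.item():.3f}")
```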
arXiv Detail & Related papers (2024-10-07T00:09:50Z)
- Robust Claim Verification Through Fact Detection [17.29665711917281]
Our novel approach, FactDetect, leverages Large Language Models (LLMs) to generate concise factual statements from evidence.
The generated facts are then combined with the claim and evidence.
Our method improves a supervised claim verification model by 15% on the F1 score.
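A minimal sketch of this pipeline follows, assuming a caller-supplied `llm` callable (prompt in, completion out); the prompt wording and the combined verifier input format are illustrative assumptions, not the paper's.
```python
# Minimal sketch of a FactDetect-style pipeline: generate concise facts
# from evidence, then combine them with the claim and evidence.
from typing import Callable, List

def extract_facts(evidence: str, llm: Callable[[str], str]) -> List[str]:
    """Ask the LLM to distill the evidence into short factual statements."""
    prompt = ("List the atomic factual statements in the following "
              f"evidence, one per line:\n{evidence}")
    return [ln.strip("- ").strip() for ln in llm(prompt).splitlines()
            if ln.strip()]

def verifier_input(claim: str, evidence: str, facts: List[str]) -> str:
    """Combine claim, evidence, and generated facts for the verifier."""
    fact_block = "\n".join(f"- {f}" for f in facts)
    return f"claim: {claim}\nevidence: {evidence}\nfacts:\n{fact_block}"

# Toy usage with a stubbed LLM.
stub_llm = lambda prompt: "- The Eiffel Tower is in Paris."
evidence = "The Eiffel Tower is a landmark in Paris, France."
print(verifier_input("The Eiffel Tower is in Berlin.", evidence,
                     extract_facts(evidence, stub_llm)))
```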
arXiv Detail & Related papers (2024-07-25T20:03:43Z)
- GenAudit: Fixing Factual Errors in Language Model Outputs with Evidence [64.95492752484171]
We present GenAudit -- a tool intended to assist fact-checking LLM responses for document-grounded tasks.
We train models to suggest edits and supporting evidence for a given response, and design an interactive interface to present these suggestions to users.
To ensure that most errors are flagged by the system, we propose a method that can increase the error recall while minimizing impact on precision.
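One generic way to realize such a recall-oriented trade-off is sketched below (the paper's actual method may differ): among detection thresholds that reach a target error recall, keep the one that sacrifices the least precision.
```python
# Minimal sketch of recall-first threshold selection for an error detector.
# Scores are per-span error probabilities; labels mark true errors.
from typing import List

def pick_threshold(scores: List[float], labels: List[int],
                   target_recall: float = 0.9) -> float:
    """Return the threshold with the best precision among those whose
    recall meets the target."""
    best_t, best_precision = 0.0, -1.0
    for t in sorted(set(scores)):
        flagged = [s >= t for s in scores]
        tp = sum(f and y for f, y in zip(flagged, labels))
        fp = sum(f and not y for f, y in zip(flagged, labels))
        fn = sum((not f) and y for f, y in zip(flagged, labels))
        recall = tp / (tp + fn) if tp + fn else 0.0
        precision = tp / (tp + fp) if tp + fp else 0.0
        if recall >= target_recall and precision > best_precision:
            best_t, best_precision = t, precision
    return best_t

print(pick_threshold([0.9, 0.7, 0.4, 0.2], [1, 1, 0, 1]))  # -> 0.2
```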
arXiv Detail & Related papers (2024-02-19T21:45:55Z)
- Lyra: Orchestrating Dual Correction in Automated Theorem Proving [63.115422781158934]
Lyra is a new framework that employs two distinct correction mechanisms: Tool Correction and Conjecture Correction.
Tool Correction contributes to mitigating hallucinations, thereby improving the overall accuracy of the proof.
Conjecture Correction refines the generation with instructions but, unlike prior refinement methods, does not collect paired (generation, error & refinement) prompts.
arXiv Detail & Related papers (2023-09-27T17:29:41Z)
- Give Me More Details: Improving Fact-Checking with Latent Retrieval [58.706972228039604]
Evidence plays a crucial role in automated fact-checking.
Existing fact-checking systems either assume the evidence sentences are given or use the search snippets returned by the search engine.
We propose to incorporate full text from source documents as evidence and introduce two enriched datasets.
arXiv Detail & Related papers (2023-05-25T15:01:19Z)
- Read it Twice: Towards Faithfully Interpretable Fact Verification by Revisiting Evidence [59.81749318292707]
We propose a fact verification model named ReRead to retrieve evidence and verify claims.
The proposed system achieves significant improvements over the best-reported models under different settings.
arXiv Detail & Related papers (2023-05-02T03:23:14Z)
- Converge to the Truth: Factual Error Correction via Iterative Constrained Editing [30.740281040892086]
We propose VENCE, a novel method for factual error correction (FEC) with minimal edits.
VENCE formulates the FEC problem as iteratively sampling editing actions with respect to a target density function.
Experiments on a public dataset show that VENCE improves the well-adopted SARI metric by 5.3 (or a relative improvement of 11.8%) over the previous best distantly-supervised methods.
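A Metropolis-Hastings-flavoured sketch of this iterative-editing idea follows; the single-token substitution proposal and the evidence-overlap density are toy stand-ins for VENCE's learned proposal and verification-based target density.
```python
# Minimal sketch of iterative constrained editing via accept/reject sampling.
import math
import random

def target_density(claim_tokens, evidence_tokens):
    """Toy density: rewards claim tokens supported by the evidence."""
    support = sum(t in evidence_tokens for t in claim_tokens)
    return math.exp(support / max(len(claim_tokens), 1))

def propose(claim_tokens, evidence_tokens):
    """One edit action: substitute a random token with an evidence token."""
    new = list(claim_tokens)
    new[random.randrange(len(new))] = random.choice(list(evidence_tokens))
    return new

def iterative_edit(claim, evidence, steps=200, seed=0):
    random.seed(seed)
    cur = claim.split()
    ev = set(evidence.lower().split())
    cur_p = target_density([t.lower() for t in cur], ev)
    for _ in range(steps):
        cand = propose(cur, ev)
        cand_p = target_density([t.lower() for t in cand], ev)
        if random.random() < min(1.0, cand_p / cur_p):  # MH-style acceptance
            cur, cur_p = cand, cand_p
    return " ".join(cur)

print(iterative_edit("the tower is in berlin", "the tower is in paris"))
```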
arXiv Detail & Related papers (2022-11-22T10:03:13Z)
- Factual Error Correction for Abstractive Summaries Using Entity Retrieval [57.01193722520597]
We propose RFEC, an efficient factual error correction system based on an entity retrieval post-editing process.
RFEC retrieves the evidence sentences from the original document by comparing the sentences with the target summary.
Next, RFEC detects the entity-level errors in the summaries by considering the evidence sentences and substitutes the wrong entities with the accurate entities from the evidence sentences.
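A minimal sketch of entity-level post-editing in this spirit follows; the overlap-based retrieval and the capitalization-based "NER" are toy stand-ins for RFEC's retrieval and entity-error detection models.
```python
# Minimal sketch of entity-retrieval post-editing for summaries.
from typing import List

STOP = {"The", "A", "An", "It", "In"}  # crude filter for the toy NER

def retrieve_evidence(summary: str, document_sents: List[str]) -> str:
    """Pick the document sentence with the largest word overlap."""
    words = set(summary.lower().split())
    return max(document_sents,
               key=lambda s: len(words & set(s.lower().split())))

def entities(text: str) -> List[str]:
    """Toy entity extractor: capitalized, non-stopword tokens."""
    return [t.strip(".,") for t in text.split()
            if t[:1].isupper() and t.strip(".,") not in STOP]

def post_edit(summary: str, document_sents: List[str]) -> str:
    """Swap summary entities unsupported by the evidence for evidence ones."""
    ev_ents = entities(retrieve_evidence(summary, document_sents))
    out = summary
    for ent in entities(summary):
        if ent not in ev_ents and ev_ents:
            out = out.replace(ent, ev_ents[0])
    return out

print(post_edit("Smith founded the company in 1999.",
                ["The company was founded by Jones in 1999.",
                 "It makes engines."]))  # -> "Jones founded the company..."
```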
arXiv Detail & Related papers (2022-04-18T11:35:02Z)
- Factual Error Correction of Claims [18.52583883901634]
This paper introduces the task of factual error correction.
It provides a mechanism to correct written texts that contain misinformation.
It acts as an inherent explanation for claims already partially supported by evidence.
arXiv Detail & Related papers (2020-12-31T18:11:26Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.