Program Enhanced Fact Verification with Verbalization and Graph Attention Network
- URL: http://arxiv.org/abs/2010.03084v6
- Date: Sun, 12 Sep 2021 02:16:57 GMT
- Title: Program Enhanced Fact Verification with Verbalization and Graph Attention Network
- Authors: Xiaoyu Yang, Feng Nie, Yufei Feng, Quan Liu, Zhigang Chen, Xiaodan Zhu
- Abstract summary: We present a Program-enhanced Verbalization and Graph Attention Network (ProgVGAT) to integrate programs and their execution into textual inference models.
We construct graph attention verification networks, which are designed to fuse evidence from verbalized program execution, program structures, and the original statements and tables.
Experimental results show that the proposed framework achieves new state-of-the-art performance of 74.4% accuracy on the benchmark dataset TABFACT.
- Score: 25.33739187395408
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Performing fact verification based on structured data is important for many
real-life applications and is a challenging research problem, particularly when
it involves both symbolic operations and informal inference based on language
understanding. In this paper, we present a Program-enhanced Verbalization and
Graph Attention Network (ProgVGAT) to integrate programs and their execution into
textual inference models. Specifically, a verbalization-with-program-execution
model is proposed to accumulate the evidence embedded in operations over the
tables. Built on that, we construct graph attention verification networks, which
are designed to fuse different sources of evidence, namely verbalized program
execution, program structures, and the original statements and tables, to make
the final verification decision. To support the above framework, we propose a
program selection module optimized with a new training strategy based on margin
loss to produce more accurate programs, which is shown to be effective in
enhancing the final verification results. Experimental results show that the
proposed framework achieves new state-of-the-art performance of 74.4% accuracy
on the benchmark dataset TABFACT.
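To make the verbalization step concrete, the sketch below shows one hypothetical way an executed table operation could be rendered as a natural-language evidence sentence. The operation names and templates are illustrative assumptions, not ProgVGAT's actual implementation.

```python
# Hypothetical sketch: template-based verbalization of executed table operations.
# Operation names and templates are illustrative, not the paper's actual set.

TEMPLATES = {
    "max":   "the maximum of column '{col}' is {result}",
    "min":   "the minimum of column '{col}' is {result}",
    "count": "there are {result} rows matching the condition",
    "avg":   "the average of column '{col}' is {result}",
}

def verbalize(op: str, col: str, result) -> str:
    """Render one executed operation as an evidence sentence."""
    return TEMPLATES[op].format(col=col, result=result)

# e.g. verbalize("max", "attendance", 52500)
# -> "the maximum of column 'attendance' is 52500"
```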
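The fusion step builds on graph attention. Since this summary does not reproduce the paper's exact architecture, the following is a minimal single-head graph-attention layer of the standard GAT form that such verification networks typically build on; the dimensions and layer structure are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """Minimal single-head graph attention over node embeddings.
    Assumes `adj` includes self-loops so every node has a neighbor."""

    def __init__(self, dim: int):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)    # shared linear transform
        self.a = nn.Linear(2 * dim, 1, bias=False)  # attention scoring vector

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (n, dim) node features; adj: (n, n) binary adjacency with self-loops
        z = self.W(h)
        n = z.size(0)
        # Pairwise concatenation [z_i || z_j] for all node pairs.
        pairs = torch.cat(
            [z.unsqueeze(1).expand(n, n, -1), z.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )
        scores = F.leaky_relu(self.a(pairs).squeeze(-1))      # (n, n) raw scores
        scores = scores.masked_fill(adj == 0, float("-inf"))  # mask non-edges
        alpha = torch.softmax(scores, dim=-1)                 # attention weights
        return alpha @ z                                      # fused node features

# e.g. layer = GraphAttentionLayer(dim=64)
#      fused = layer(torch.randn(5, 64), torch.eye(5))  # self-loops only
```

In ProgVGAT's setting, the nodes would carry embeddings of the different evidence sources (verbalized executions, program structure, statement, and table), with attention deciding how much each contributes to the verification decision.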
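Similarly, the margin-loss training strategy for program selection can be sketched with PyTorch's built-in ranking loss; the margin value and scoring setup here are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of margin-loss program selection: push scores of
# label-consistent candidate programs above inconsistent ones by a fixed margin.
ranking_loss = nn.MarginRankingLoss(margin=0.2)  # margin value is illustrative

def selection_loss(pos_scores: torch.Tensor,
                   neg_scores: torch.Tensor) -> torch.Tensor:
    target = torch.ones_like(pos_scores)  # +1: first argument should rank higher
    return ranking_loss(pos_scores, neg_scores, target)

# e.g. selection_loss(torch.tensor([0.9]), torch.tensor([0.4]))
# -> tensor(0.)  (the 0.5 gap already exceeds the 0.2 margin)
```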
Related papers
- FS-RAG: A Frame Semantics Based Approach for Improved Factual Accuracy in Large Language Models [2.1484130681985047]
We present a novel extension to Retrieval Augmented Generation with the goal of mitigating factual inaccuracies in the output of large language models.
Our method draws on the cognitive linguistic theory of frame semantics for the indexing and retrieval of factual information relevant to helping large language models answer queries.
arXiv Detail & Related papers (2024-06-23T17:18:19Z)
- A Large-Scale Evaluation of Speech Foundation Models [110.95827399522204]
We establish the Speech processing Universal PERformance Benchmark (SUPERB) to study the effectiveness of the foundation model paradigm for speech.
We propose a unified multi-tasking framework to address speech processing tasks in SUPERB using a frozen foundation model followed by task-specialized, lightweight prediction heads.
arXiv Detail & Related papers (2024-04-15T00:03:16Z)
- PERFOGRAPH: A Numerical Aware Program Graph Representation for Performance Optimization and Program Analysis [12.778336318809092]
A key challenge in adopting the latest machine learning methods is the representation of programming languages.
To overcome the limitations and challenges of current program representations, we propose a graph-based program representation called PERFOGRAPH.
PERFOGRAPH can capture numerical information and aggregate data structures by introducing new nodes and edges.
arXiv Detail & Related papers (2023-05-31T21:59:50Z)
- Boosting Event Extraction with Denoised Structure-to-Text Augmentation [52.21703002404442]
Event extraction aims to recognize pre-defined event triggers and arguments from texts.
Recent data augmentation methods often neglect the problem of grammatical incorrectness.
We propose DAEE, a denoised structure-to-text augmentation framework for event extraction.
arXiv Detail & Related papers (2023-05-16T16:52:07Z)
- FactGraph: Evaluating Factuality in Summarization with Semantic Graph Representations [114.94628499698096]
We propose FactGraph, a method that decomposes the document and the summary into structured meaning representations (MRs).
MRs describe core semantic concepts and their relations, aggregating the main content of both the document and the summary in a canonical form and reducing data sparsity.
Experiments on different benchmarks for evaluating factuality show that FactGraph outperforms previous approaches by up to 15%.
arXiv Detail & Related papers (2022-04-13T16:45:33Z)
- SDCUP: Schema Dependency-Enhanced Curriculum Pre-Training for Table Semantic Parsing [19.779493883522072]
This paper designs two novel pre-training objectives to impose the desired inductive bias into the learned representations for table pre-training.
We propose a schema-aware curriculum learning approach to mitigate the impact of noise and learn effectively from the pre-training data in an easy-to-hard manner.
arXiv Detail & Related papers (2021-11-18T02:51:04Z)
- Exploring Decomposition for Table-based Fact Verification [18.584226291619217]
We improve fact verification by decomposing complex statements into simpler subproblems.
Our proposed approach achieves new state-of-the-art performance of 82.7% accuracy on the TabFact benchmark.
arXiv Detail & Related papers (2021-09-22T20:15:05Z)
- Logic-level Evidence Retrieval and Graph-based Verification Network for Table-based Fact Verification [16.30704888733592]
We formulate the table-based fact verification task as an evidence retrieval and reasoning framework.
Specifically, we first retrieve logic-level program-like evidence from the given table and statement as supplementary evidence for the table.
After that, we construct a logic-level graph to capture the logical relations between entities and functions in the retrieved evidence.
arXiv Detail & Related papers (2021-09-14T07:25:36Z)
- Enforcing Consistency in Weakly Supervised Semantic Parsing [68.2211621631765]
We explore the use of consistency between the output programs for related inputs to reduce the impact of spurious programs.
We find that a more consistent formalism leads to improved model performance even without consistency-based training.
arXiv Detail & Related papers (2021-07-13T03:48:04Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network [111.24773949467567]
We propose LogicalFactChecker, a neural network approach capable of leveraging logical operations for fact checking.
It achieves state-of-the-art performance on TABFACT, a large-scale benchmark dataset.
arXiv Detail & Related papers (2020-04-28T17:04:19Z)