Logic-level Evidence Retrieval and Graph-based Verification Network for
Table-based Fact Verification
- URL: http://arxiv.org/abs/2109.06480v1
- Date: Tue, 14 Sep 2021 07:25:36 GMT
- Title: Logic-level Evidence Retrieval and Graph-based Verification Network for
Table-based Fact Verification
- Authors: Qi Shi, Yu Zhang, Qingyu Yin, Ting Liu
- Abstract summary: We formulate the table-based fact verification task as an evidence retrieval and reasoning framework.
Specifically, we first retrieve logic-level program-like evidence from the given table and statement as supplementary evidence for the table.
After that, we construct a logic-level graph to capture the logical relations between entities and functions in the retrieved evidence.
- Score: 16.30704888733592
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The table-based fact verification task aims to verify whether a given
statement is supported by a given semi-structured table. Symbolic reasoning with
logical operations plays a crucial role in this task. Existing methods leverage
programs that contain rich logical information to enhance the verification
process. However, due to the lack of fully supervised signals in the program
generation process, spurious programs can be derived and employed, which prevents
the model from capturing helpful logical operations. To address
the aforementioned problems, in this work, we formulate the table-based fact
verification task as an evidence retrieval and reasoning framework, proposing
the Logic-level Evidence Retrieval and Graph-based Verification network
(LERGV). Specifically, we first retrieve logic-level program-like evidence from
the given table and statement as supplementary evidence for the table. After
that, we construct a logic-level graph to capture the logical relations between
entities and functions in the retrieved evidence, and design a graph-based
verification network to perform logic-level graph-based reasoning based on the
constructed graph to classify the final entailment relation. Experimental
results on the large-scale benchmark TABFACT show the effectiveness of the
proposed approach.
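To make the pipeline the abstract describes more concrete, here is a minimal sketch: a LISP-style program retrieved as logic-level evidence is parsed into a graph whose nodes are functions and entities, and a small message-passing network reads the graph out into an entailment decision. Everything here (parse_program, GraphVerifier, the toy program, and the two-way label set) is an illustrative assumption, not the authors' released implementation.

```python
import torch
import torch.nn as nn

def parse_program(program: str):
    """Parse a LISP-style program such as 'eq { count { all_rows } ; 3 }'
    into (nodes, edges): functions and entities become nodes, and each
    function node is linked to its argument nodes."""
    tokens = program.replace("{", " { ").replace("}", " } ").replace(";", " ; ").split()
    nodes, edges, stack = [], [], []
    for i, tok in enumerate(tokens):
        if tok in ("{", ";"):
            continue
        if tok == "}":
            stack.pop()  # close the current function scope
            continue
        nid = len(nodes)
        # A token directly followed by '{' heads a function call;
        # anything else is an entity or literal value.
        is_func = i + 1 < len(tokens) and tokens[i + 1] == "{"
        nodes.append((tok, "function" if is_func else "entity"))
        if stack:
            edges.append((stack[-1], nid))  # parent function -> argument
        if is_func:
            stack.append(nid)
    return nodes, edges

class GraphVerifier(nn.Module):
    """One round of mean-aggregation message passing over the logic graph,
    followed by a two-way entailment classifier. A generic stand-in for a
    graph-based verification network, not the paper's exact architecture."""
    def __init__(self, vocab_size: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.message = nn.Linear(dim, dim)
        self.classify = nn.Linear(dim, 2)  # SUPPORTED vs. REFUTED

    def forward(self, node_ids, edges):
        h = self.embed(node_ids)             # (N, dim) node features
        agg = torch.zeros_like(h)
        deg = torch.ones(h.size(0), 1)
        for src, dst in edges:               # propagate along logic edges
            agg[dst] += self.message(h[src])
            deg[dst] += 1
        h = torch.relu(h + agg / deg)        # residual mean update
        return self.classify(h.mean(dim=0)) # graph-level readout

# Usage on a toy program (the vocabulary here is local to the example;
# a real system would share the statement/table encoder's vocabulary).
nodes, edges = parse_program("eq { count { all_rows } ; 3 }")
tok2id = {}
for tok, _ in nodes:
    tok2id.setdefault(tok, len(tok2id))
model = GraphVerifier(vocab_size=len(tok2id))
logits = model(torch.tensor([tok2id[tok] for tok, _ in nodes]), edges)
```

The point of the graph form is that argument structure (which entity feeds which function) is made explicit, so the verifier can reason over logical relations between entities and functions rather than over a flat token sequence.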
Related papers
- Chain-of-Table: Evolving Tables in the Reasoning Chain for Table Understanding [79.9461269253121]
We propose the Chain-of-Table framework, where tabular data is explicitly used in the reasoning chain as a proxy for intermediate thoughts.
Chain-of-Table achieves new state-of-the-art performance on WikiTQ, FeTaQA, and TabFact benchmarks.
arXiv Detail & Related papers (2024-01-09T07:46:26Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Investigating the Robustness of Natural Language Generation from Logical Forms via Counterfactual Samples [30.079030298066847]
State-of-the-art methods based on pre-trained models have achieved remarkable performance on the standard test dataset.
We question whether these methods really learn how to perform logical reasoning, rather than just relying on the spurious correlations between the headers of the tables and operators of the logical form.
We propose two approaches to reduce the model's reliance on the shortcut.
arXiv Detail & Related papers (2022-10-16T14:14:53Z)
- Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA tasks and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z)
- PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation [44.78200830757109]
We propose a PLOG (Pretrained Logical Form Generator) framework to improve the generation fidelity.
PLOG is first pretrained on a table-to-logic-form generation task, then finetuned on downstream table-to-text tasks.
PLOG can learn logical inference from table-logic pairs far more reliably than from table-text pairs.
arXiv Detail & Related papers (2022-05-25T11:55:54Z)
- LogiGAN: Learning Logical Reasoning via Adversarial Pre-training [58.11043285534766]
We present LogiGAN, an unsupervised adversarial pre-training framework for improving logical reasoning abilities of language models.
Inspired by the facilitation effect of reflective thinking in human learning, we simulate the learning-thinking process with an adversarial Generator-Verifier architecture.
Both base- and large-size language models pre-trained with LogiGAN show clear performance improvements on 12 datasets.
arXiv Detail & Related papers (2022-05-18T08:46:49Z)
- Program Enhanced Fact Verification with Verbalization and Graph Attention Network [25.33739187395408]
We present a Program-enhanced Verbalization and Graph Attention Network (ProgVGAT) to integrate programs and execution into textual inference models.
We construct graph attention verification networks, which are designed to fuse different sources of evidence from verbalized program execution, program structures, and the original statements and tables.
Experimental results show that the proposed framework achieves new state-of-the-art performance of 74.4% accuracy on the benchmark dataset TABFACT.
arXiv Detail & Related papers (2020-10-06T23:29:08Z)
- LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network [111.24773949467567]
We propose LogicalFactChecker, a neural network approach capable of leveraging logical operations for fact checking.
It achieves state-of-the-art performance on TABFACT, a large-scale benchmark dataset.
arXiv Detail & Related papers (2020-04-28T17:04:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.