FactKG: Fact Verification via Reasoning on Knowledge Graphs
- URL: http://arxiv.org/abs/2305.06590v2
- Date: Fri, 19 May 2023 03:50:52 GMT
- Title: FactKG: Fact Verification via Reasoning on Knowledge Graphs
- Authors: Jiho Kim, Sungjin Park, Yeonsu Kwon, Yohan Jo, James Thorne, Edward Choi
- Abstract summary: We introduce a new dataset, FactKG: Fact Verification via Reasoning on Knowledge Graphs.
It consists of 108k natural language claims with five types of reasoning: One-hop, Conjunction, Existence, Multi-hop, and Negation.
We develop a baseline approach and analyze FactKG over these reasoning types.
- Score: 12.5437408722802
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In real-world applications, knowledge graphs (KGs) are widely used in various
domains (e.g., medical applications and dialogue agents). However, for fact
verification, KGs have not been adequately utilized as a knowledge source. KGs
can be a valuable knowledge source in fact verification due to their
reliability and broad applicability. A KG consists of nodes and edges which
makes it clear how concepts are linked together, allowing machines to reason
over chains of topics. However, there are many challenges in understanding how
these machine-readable concepts map to information in text. To enable the
community to better use KGs, we introduce a new dataset, FactKG: Fact
Verification via Reasoning on Knowledge Graphs. It consists of 108k natural
language claims with five types of reasoning: One-hop, Conjunction, Existence,
Multi-hop, and Negation. Furthermore, FactKG contains various linguistic
patterns, including colloquial style claims as well as written style claims to
increase practicality. Lastly, we develop a baseline approach and analyze
FactKG over these reasoning types. We believe FactKG can advance both
reliability and practicality in KG-based fact verification.
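The five reasoning types over a KG of (head, relation, tail) triples can be roughly illustrated as follows. This is a minimal sketch with invented triples and claim forms, not the FactKG baseline approach:

```python
# Toy KG as a set of (head, relation, tail) triples; entities and
# relations here are illustrative, not taken from the FactKG dataset.
KG = {
    ("Alan_Turing", "birthPlace", "London"),
    ("London", "country", "United_Kingdom"),
    ("Alan_Turing", "field", "Computer_Science"),
}

def one_hop(h, r, t):
    """One-hop: the claim maps to a single edge."""
    return (h, r, t) in KG

def conjunction(triples):
    """Conjunction: every sub-claim edge must hold."""
    return all(tr in KG for tr in triples)

def existence(h, r):
    """Existence: some tail entity exists for (h, r)."""
    return any(h2 == h and r2 == r for h2, r2, _ in KG)

def multi_hop(h, relations, t):
    """Multi-hop: follow a chain of relations from h to t."""
    frontier = {h}
    for r in relations:
        frontier = {t2 for h2, r2, t2 in KG if h2 in frontier and r2 == r}
    return t in frontier

def negation(h, r, t):
    """Negation: the claim asserts an edge is absent."""
    return (h, r, t) not in KG

# "Alan Turing was born in a city in the United Kingdom" → multi-hop
print(multi_hop("Alan_Turing", ["birthPlace", "country"], "United_Kingdom"))  # True
```

In practice the hard part is mapping a natural language claim (colloquial or written style) to such structured checks, which is exactly the gap the dataset probes.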
Related papers
- Decoding on Graphs: Faithful and Sound Reasoning on Knowledge Graphs through Generation of Well-Formed Chains [66.55612528039894]
Knowledge Graphs (KGs) can serve as reliable knowledge sources for question answering (QA).
We present DoG (Decoding on Graphs), a novel framework that facilitates a deep synergy between LLMs and KGs.
Experiments across various KGQA tasks with different background KGs demonstrate that DoG achieves superior and robust performance.
arXiv Detail & Related papers (2024-10-24T04:01:40Z)
- Graph-constrained Reasoning: Faithful Reasoning on Knowledge Graphs with Large Language Models [83.28737898989694]
Large language models (LLMs) struggle with faithful reasoning due to knowledge gaps and hallucinations.
We introduce graph-constrained reasoning (GCR), a novel framework that bridges structured knowledge in KGs with unstructured reasoning in LLMs.
GCR achieves state-of-the-art performance and exhibits strong zero-shot generalizability to unseen KGs without additional training.
arXiv Detail & Related papers (2024-10-16T22:55:17Z)
- A Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning [17.676185326247946]
We propose a prompt-based KG foundation model via in-context learning, namely KG-ICL, to achieve a universal reasoning ability.
To encode prompt graphs with the generalization ability to unseen entities and relations in queries, we first propose a unified tokenizer.
Then, we propose two message passing neural networks to perform prompt encoding and KG reasoning, respectively.
arXiv Detail & Related papers (2024-10-16T06:47:18Z)
- Context Graph [8.02985792541121]
We present a context graph reasoning (CGR^3) paradigm that leverages large language models (LLMs) to retrieve candidate entities and related contexts.
Our experimental results demonstrate that CGR^3 significantly improves performance on KG completion (KGC) and KG question answering (KGQA) tasks.
arXiv Detail & Related papers (2024-06-17T02:59:19Z)
- Reasoning on Graphs: Faithful and Interpretable Large Language Model Reasoning [104.92384929827776]
Large language models (LLMs) have demonstrated impressive reasoning abilities in complex tasks.
However, they lack up-to-date knowledge and are prone to hallucinations during reasoning.
Knowledge graphs (KGs) offer a reliable source of knowledge for reasoning.
arXiv Detail & Related papers (2023-10-02T10:14:43Z)
- ChatRule: Mining Logical Rules with Large Language Models for Knowledge Graph Reasoning [107.61997887260056]
We propose a novel framework, ChatRule, unleashing the power of large language models for mining logical rules over knowledge graphs.
Specifically, the framework is initiated with an LLM-based rule generator, leveraging both the semantic and structural information of KGs.
To refine the generated rules, a rule ranking module estimates the rule quality by incorporating facts from existing KGs.
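The rule-ranking idea — scoring a candidate rule by how often its groundings are supported by existing KG facts — can be sketched with a plain path-rule confidence measure. The toy triples, relation names, and the specific metric below are illustrative assumptions, not ChatRule's actual implementation:

```python
# Toy KG for illustration; the rule "born_in(X,Z) & city_of(Z,Y) => nationality(X,Y)"
# is an invented example of a mined logical rule.
KG = {
    ("alice", "born_in", "paris"),
    ("paris", "city_of", "france"),
    ("alice", "nationality", "france"),
    ("bob", "born_in", "lyon"),
    ("lyon", "city_of", "france"),
}

def chain_groundings(kg, relations):
    """All (start, end) entity pairs connected by the given relation chain."""
    pairs = {(h, t) for h, r, t in kg if r == relations[0]}
    for rel in relations[1:]:
        pairs = {(x, t) for x, y in pairs
                 for h, r, t in kg if h == y and r == rel}
    return pairs

def rule_confidence(kg, body, head):
    """Fraction of body groundings whose head edge exists in the KG."""
    groundings = chain_groundings(kg, body)
    if not groundings:
        return 0.0
    hits = sum((x, head, y) in kg for x, y in groundings)
    return hits / len(groundings)

# Both alice and bob ground the body, but only alice's nationality edge exists.
print(rule_confidence(KG, ["born_in", "city_of"], "nationality"))  # 0.5
```

A rule ranking module can then keep only rules whose confidence over the KG exceeds some threshold before they are used for reasoning.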
arXiv Detail & Related papers (2023-09-04T11:38:02Z)
- Reasoning over Multi-view Knowledge Graphs [59.99051368907095]
ROMA is a novel framework for answering logical queries over multi-view KGs.
It scales up to large KGs (e.g., millions of facts) and fine-grained views.
It generalizes to query structures and KG views that are unobserved during training.
arXiv Detail & Related papers (2022-09-27T21:32:20Z)
- Trustworthy Knowledge Graph Completion Based on Multi-sourced Noisy Data [35.938323660176145]
We propose a new trustworthy KG completion method that derives reliable facts from multi-sourced noisy data and existing facts in the KG.
Specifically, we introduce a graph neural network with a holistic scoring function to judge the plausibility of facts with various value types.
We present a truth inference model that incorporates data source qualities into the fact scoring function, and design a semi-supervised learning method to infer the truths from heterogeneous values.
arXiv Detail & Related papers (2022-01-21T07:59:16Z)
- DegreEmbed: incorporating entity embedding into logic rule learning for knowledge graph reasoning [7.066269573204757]
Link prediction for knowledge graphs is the task of completing missing facts by reasoning over existing knowledge.
We propose DegreEmbed, a model that combines embedding-based learning and logic rule mining for reasoning over KGs.
arXiv Detail & Related papers (2021-12-18T13:38:48Z)
- Multilingual Knowledge Graph Completion via Ensemble Knowledge Transfer [43.453915033312114]
Predicting missing facts in a knowledge graph (KG) is a crucial task in knowledge base construction and reasoning.
We propose KEnS, a novel framework for embedding learning and ensemble knowledge transfer across a number of language-specific KGs.
Experiments on five real-world language-specific KGs show that KEnS consistently improves state-of-the-art methods on KG completion.
arXiv Detail & Related papers (2020-10-07T04:54:03Z)
- Connecting the Dots: A Knowledgeable Path Generator for Commonsense Question Answering [50.72473345911147]
This paper augments a general commonsense QA framework with a knowledgeable path generator.
By extrapolating over existing paths in a KG with a state-of-the-art language model, our generator learns to connect a pair of entities in text with a dynamic, and potentially novel, multi-hop relational path.
arXiv Detail & Related papers (2020-05-02T03:53:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.