Question Answering over Knowledge Bases by Leveraging Semantic Parsing
and Neuro-Symbolic Reasoning
- URL: http://arxiv.org/abs/2012.01707v1
- Date: Thu, 3 Dec 2020 05:17:55 GMT
- Authors: Pavan Kapanipathi, Ibrahim Abdelaziz, Srinivas Ravishankar, Salim
Roukos, Alexander Gray, Ramon Astudillo, Maria Chang, Cristina Cornelio,
Saswati Dana, Achille Fokoue, Dinesh Garg, Alfio Gliozzo, Sairam Gurajada,
Hima Karanam, Naweed Khan, Dinesh Khandelwal, Young-Suk Lee, Yunyao Li,
Francois Luus, Ndivhuwo Makondo, Nandana Mihindukulasooriya, Tahira Naseem,
Sumit Neelam, Lucian Popa, Revanth Reddy, Ryan Riegel, Gaetano Rossiello,
Udit Sharma, G P Shrivatsa Bhargav, Mo Yu
- Abstract summary: We propose a semantic parsing and reasoning-based Neuro-Symbolic Question Answering (NSQA) system.
NSQA achieves state-of-the-art performance on QALD-9 and LC-QuAD 1.0.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge base question answering (KBQA) is an important task in Natural
Language Processing. Existing approaches face significant challenges including
complex question understanding, necessity for reasoning, and lack of large
training datasets. In this work, we propose a semantic parsing and
reasoning-based Neuro-Symbolic Question Answering (NSQA) system that leverages
(1) Abstract Meaning Representation (AMR) parses for task-independent question
understanding; (2) a novel path-based approach to transform AMR parses into
candidate logical queries that are aligned to the KB; (3) a neuro-symbolic
reasoner called Logical Neural Network (LNN) that executes logical queries and
reasons over KB facts to provide an answer; and (4) a system-of-systems
approach, which integrates multiple, reusable modules that are trained
specifically for their individual tasks (e.g., semantic parsing, entity linking,
and relationship linking) and do not require end-to-end training data. NSQA
achieves state-of-the-art performance on QALD-9 and LC-QuAD 1.0. NSQA's novelty
lies in its modular neuro-symbolic architecture and its task-general approach
to interpreting natural language questions.
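The four-stage pipeline described in the abstract can be sketched in miniature. This is an illustrative, heavily simplified mock-up, not the actual NSQA implementation: all function names, the toy AMR dictionary, and the `dbo:`/`dbr:` identifiers are placeholders standing in for the real AMR parser, path-based query builder, and LNN reasoner.

```python
# Hedged sketch of the NSQA pipeline, steps (1)-(4) from the abstract.
# Every name and data structure here is an illustrative placeholder.

def parse_amr(question: str) -> dict:
    """Step 1: task-independent question understanding via an AMR parse.
    Here we return a tiny hand-written AMR-like graph for one question."""
    return {
        "root": "direct-01",                       # predicate concept
        "ARG0": {"name": "Stanley Kubrick"},       # agent
        "ARG1": {"var": "m", "type": "movie"},     # theme
        "focus": "m",                              # the unknown being asked for
    }

def amr_to_logic(amr: dict) -> list:
    """Step 2: path-based transformation of the AMR graph into candidate
    KB-aligned logical queries (shown as SPARQL-style triple patterns)."""
    return [("?m", "dbo:director", "dbr:Stanley_Kubrick")]

def lnn_reason(query: list, kb: set) -> set:
    """Step 3: stand-in for the Logical Neural Network reasoner; here it
    simply evaluates each triple pattern against a toy fact set."""
    answers = set()
    for _subj, pred, obj in query:
        for fact in kb:
            if fact[1] == pred and fact[2] == obj:
                answers.add(fact[0])
    return answers

def nsqa_pipeline(question: str, kb: set) -> set:
    """Step 4: the 'system of systems' -- independently built modules
    composed without any end-to-end training data."""
    return lnn_reason(amr_to_logic(parse_amr(question)), kb)

toy_kb = {
    ("dbr:A_Clockwork_Orange", "dbo:director", "dbr:Stanley_Kubrick"),
    ("dbr:Jaws", "dbo:director", "dbr:Steven_Spielberg"),
}
print(nsqa_pipeline("Which movies did Stanley Kubrick direct?", toy_kb))
# prints {'dbr:A_Clockwork_Orange'}
```

The point of the modular design is visible even in this toy: each stage could be swapped or retrained independently, since the only contract between them is the intermediate representation (AMR graph, then triple patterns).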
Related papers
- ProSLM : A Prolog Synergized Language Model for explainable Domain Specific Knowledge Based Question Answering [0.0]
Neurosymbolic approaches can add robustness to opaque neural systems by incorporating explainable symbolic representations.
We propose ProSLM, a novel neurosymbolic framework, to improve the robustness and reliability of large language models.
Our work opens a new area of neurosymbolic generative AI text validation and user personalization.
arXiv Detail & Related papers (2024-09-17T22:34:33Z) - Interactive-KBQA: Multi-Turn Interactions for Knowledge Base Question Answering with Large Language Models [7.399563588835834]
Interactive-KBQA is a framework designed to generate logical forms through direct interaction with knowledge bases (KBs).
Our method achieves competitive results on the WebQuestionsSP, ComplexWebQuestions, KQA Pro, and MetaQA datasets.
arXiv Detail & Related papers (2024-02-23T06:32:18Z) - ChatKBQA: A Generate-then-Retrieve Framework for Knowledge Base Question Answering with Fine-tuned Large Language Models [19.85526116658481]
We introduce ChatKBQA, a novel and simple generate-then-retrieve KBQA framework.
Experimental results show that ChatKBQA achieves new state-of-the-art performance on standard KBQA datasets.
This work can also be regarded as a new paradigm for combining LLMs with knowledge graphs for interpretable and knowledge-required question answering.
arXiv Detail & Related papers (2023-10-13T09:45:14Z) - Modeling Hierarchical Reasoning Chains by Linking Discourse Units and
Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z) - HPE: Answering Complex Questions over Text by Hybrid Question Parsing and
Execution [92.69684305578957]
We propose a framework of question parsing and execution on textual QA.
The proposed framework can be viewed as a top-down question parsing followed by a bottom-up answer backtracking.
Our experiments on MuSiQue, 2WikiQA, HotpotQA, and NQ show that the proposed parsing and hybrid execution framework outperforms existing approaches in supervised, few-shot, and zero-shot settings.
arXiv Detail & Related papers (2023-05-12T22:37:06Z) - Logical Message Passing Networks with One-hop Inference on Atomic
Formulas [57.47174363091452]
We propose a framework for complex query answering that decomposes the Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z) - elBERto: Self-supervised Commonsense Learning for Question Answering [131.51059870970616]
We propose a Self-supervised Bidirectional Representation Learning of Commonsense framework, which is compatible with off-the-shelf QA model architectures.
The framework comprises five self-supervised tasks to force the model to fully exploit the additional training signals from contexts containing rich commonsense.
elBERto achieves substantial improvements on out-of-paragraph and no-effect questions where simple lexical similarity comparison does not help.
arXiv Detail & Related papers (2022-03-17T16:23:45Z) - Leveraging Semantic Parsing for Relation Linking over Knowledge Bases [80.99588366232075]
We present SLING, a relation linking framework which leverages semantic parsing using AMR and distant supervision.
SLING integrates multiple relation linking approaches that capture complementary signals such as linguistic cues, rich semantic representation, and information from the knowledge base.
Experiments on relation linking using three KBQA datasets (QALD-7, QALD-9, and LC-QuAD 1.0) demonstrate that the proposed approach achieves state-of-the-art performance on all benchmarks.
arXiv Detail & Related papers (2020-09-16T14:56:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.