Defeasible Reasoning with Knowledge Graphs
- URL: http://arxiv.org/abs/2309.12731v1
- Date: Fri, 22 Sep 2023 09:27:26 GMT
- Title: Defeasible Reasoning with Knowledge Graphs
- Authors: Dave Raggett
- Abstract summary: This paper introduces work on an intuitive notation and model for defeasible reasoning with imperfect knowledge.
The paper closes with observations on symbolic approaches in the era of large language models.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human knowledge is subject to uncertainties, imprecision, incompleteness and inconsistencies.
Moreover, the meaning of many everyday terms is dependent on the context. That poses a huge challenge for the Semantic Web.
This paper introduces work on an intuitive notation and model for defeasible reasoning with imperfect knowledge, and relates it to previous work on argumentation theory.
PKN (the Plausible Knowledge Notation) is to N3 (Notation3) as defeasible reasoning is to deductive logic.
Further work is needed on an intuitive syntax for describing reasoning strategies and tactics in declarative terms, drawing upon the AIF (Argument Interchange Format) ontology for inspiration.
The paper closes with observations on symbolic approaches in the era of large language models.
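The abstract does not reproduce the PKN syntax itself, but the core idea of defeasible reasoning, default conclusions that can be overridden by more specific or stronger knowledge, can be sketched in a few lines of Python. The rule representation and the specificity-based conflict resolution below are illustrative assumptions for exposition, not the paper's notation or model.

```python
# Generic sketch of defeasible reasoning: default rules whose conclusions can
# be defeated by more specific, conflicting rules. Illustrative only -- the
# Rule fields and the specificity-based resolution are assumptions, not PKN.
from __future__ import annotations
from dataclasses import dataclass


@dataclass(frozen=True)
class Rule:
    premise: str        # antecedent fact that triggers the rule
    conclusion: str     # conclusion; the prefix "not " marks a negation
    specificity: int    # higher values defeat lower ones on conflict


RULES = [
    Rule("bird", "flies", 1),         # birds normally fly (default)
    Rule("penguin", "not flies", 2),  # ...but penguins are an exception
]


def defeasible_consequences(facts: set[str]) -> set[str]:
    """Fire every rule whose premise is a known fact, then drop any
    conclusion that conflicts with a strictly more specific one."""
    fired: dict[str, int] = {}
    for rule in RULES:
        if rule.premise in facts:
            fired[rule.conclusion] = max(fired.get(rule.conclusion, 0),
                                         rule.specificity)
    kept = set()
    for conclusion, spec in fired.items():
        opposite = conclusion[4:] if conclusion.startswith("not ") else "not " + conclusion
        if fired.get(opposite, 0) < spec:   # survives unless defeated
            kept.add(conclusion)
    return kept


# Tweety the penguin: the bird default ("flies") is defeated by the more
# specific penguin exception, leaving only "not flies".
print(defeasible_consequences({"bird", "penguin"}))   # {'not flies'}
```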
Related papers
- Conceptual and Unbiased Reasoning in Language Models [98.90677711523645]
We propose a novel conceptualization framework that forces models to perform conceptual reasoning on abstract questions.
We show that existing large language models fall short on conceptual reasoning, dropping 9% to 28% on various benchmarks.
We then discuss how models can improve since high-level abstract reasoning is key to unbiased and generalizable decision-making.
arXiv Detail & Related papers (2024-03-30T00:53:53Z)
- UNcommonsense Reasoning: Abductive Reasoning about Uncommon Situations [62.71847873326847]
We investigate the ability to model unusual, unexpected, and unlikely situations.
Given a piece of context with an unexpected outcome, this task requires reasoning abductively to generate an explanation.
We release a new English language corpus called UNcommonsense.
arXiv Detail & Related papers (2023-11-14T19:00:55Z)
- DiPlomat: A Dialogue Dataset for Situated Pragmatic Reasoning [89.92601337474954]
Pragmatic reasoning plays a pivotal role in deciphering implicit meanings that frequently arise in real-life conversations.
We introduce a novel challenge, DiPlomat, aiming at benchmarking machines' capabilities on pragmatic reasoning and situated conversational understanding.
arXiv Detail & Related papers (2023-06-15T10:41:23Z)
- Large Language Models are In-Context Semantic Reasoners rather than Symbolic Reasoners [75.85554779782048]
Large Language Models (LLMs) have excited the natural language and machine learning community over recent years.
Despite numerous successful applications, the underlying mechanism of such in-context capabilities still remains unclear.
In this work, we hypothesize that the learned semantics of language tokens do most of the heavy lifting during the reasoning process.
arXiv Detail & Related papers (2023-05-24T07:33:34Z)
- Natural Language Reasoning, A Survey [16.80326702160048]
Conceptually, we provide a distinct definition for natural language reasoning in NLP.
We conduct a comprehensive literature review on natural language reasoning in NLP.
The paper also identifies and discusses backward reasoning, a powerful paradigm for multi-step reasoning.
arXiv Detail & Related papers (2023-03-26T13:44:18Z)
- Logical Reasoning over Natural Language as Knowledge Representation: A Survey [43.29703101875716]
This paper provides an overview on a new paradigm of logical reasoning, which uses natural language as knowledge representation and pretrained language models as reasoners.
This new paradigm is promising since it not only alleviates many challenges of formal representation but also has advantages over end-to-end neural methods.
arXiv Detail & Related papers (2023-03-21T16:56:05Z)
- MetaLogic: Logical Reasoning Explanations with Fine-Grained Structure [129.8481568648651]
We propose a benchmark to investigate models' logical reasoning capabilities in complex real-life scenarios.
Based on the multi-hop chain of reasoning, the explanation form includes three main components.
We evaluate the current best models' performance on this new explanation form.
arXiv Detail & Related papers (2022-10-22T16:01:13Z)
- Thirty years of Epistemic Specifications [8.339560855135575]
We extend disjunctive logic programs under the stable model semantics with modal constructs called subjective literals.
Using subjective literals, it is possible to check whether a regular literal is true in every or some stable models of the program.
Several attempts have been made to capture the intuitions underlying the language by means of a formal semantics.
arXiv Detail & Related papers (2021-08-17T15:03:10Z)
- Social Commonsense Reasoning with Multi-Head Knowledge Attention [24.70946979449572]
Social Commonsense Reasoning requires understanding of text, knowledge about social events and their pragmatic implications, as well as commonsense reasoning skills.
We propose a novel multi-head knowledge attention model that encodes semi-structured commonsense inference rules and learns to incorporate them in a transformer-based reasoning cell.
arXiv Detail & Related papers (2020-10-12T10:24:40Z)
- Logical Neural Networks [51.46602187496816]
We propose a novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning).
Every neuron has a meaning as a component of a formula in a weighted real-valued logic, yielding a highly interpretable disentangled representation (a sketch of such a weighted logic unit follows this list).
Inference is omnidirectional rather than focused on predefined target variables, and corresponds to logical reasoning.
arXiv Detail & Related papers (2020-06-23T16:55:45Z)
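The Logical Neural Networks entry above rests on the idea that a neuron computes the truth value of a logical connective over inputs in [0, 1], with weights expressing how much each operand matters. Below is a minimal sketch of one such unit, a weighted Lukasiewicz-style conjunction; the parameterization (the bias beta and per-input weights) is an assumption for illustration and may differ from that paper's exact formulation.

```python
import numpy as np


def weighted_and(truths, weights, beta=1.0):
    """Weighted Lukasiewicz-style conjunction over truth values in [0, 1].

    The output is the truth value of an AND formula, and each weight says
    how strongly a false conjunct pulls the result toward 0. The beta/weight
    parameterization here is an illustrative assumption.
    """
    truths = np.asarray(truths, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return float(np.clip(beta - np.sum(weights * (1.0 - truths)), 0.0, 1.0))


# All conjuncts true -> fully true; a false conjunct drags the output down
# in proportion to its weight.
print(weighted_and([1.0, 1.0], [1.0, 1.0]))   # 1.0
print(weighted_and([1.0, 0.2], [1.0, 1.0]))   # 0.2
print(weighted_and([1.0, 0.2], [1.0, 0.5]))   # 0.6 (the false conjunct matters less)
```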