Scales and Hedges in a Logic with Analogous Semantics
- URL: http://arxiv.org/abs/2201.08677v1
- Date: Fri, 21 Jan 2022 12:48:58 GMT
- Title: Scales and Hedges in a Logic with Analogous Semantics
- Authors: Hedda R. Schmidtke, Sara Coelho
- Abstract summary: Fuzzy Logic has a number of explanatory and application advantages, the most well-known being the ability to help experts develop control systems.
For social decision making in humans, it is vital that logical conclusions about others are grounded in empathic emotion.
This paper extends the existing theory by showing how scales, as required for adjective and verb semantics, can be handled by the system.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Logics with analogous semantics, such as Fuzzy Logic, have a number of
explanatory and application advantages, the most well-known being the ability
to help experts develop control systems. From a cognitive systems perspective,
such languages also have the advantage of being grounded in perception. For
social decision making in humans, it is vital that logical conclusions about
others (cognitive empathy) are grounded in empathic emotion (affective
empathy). Classical Fuzzy Logic, however, has several disadvantages: it is not
obvious how complex formulae, e.g., the description of events in a text, can be
(a) formed, (b) grounded, and (c) used in logical reasoning. The two-layered
Context Logic (CL) was designed to address these issues. Formally based on a
lattice semantics, like classical Fuzzy Logic, CL also features an analogous
semantics for complex formulae. With the Activation Bit Vector Machine (ABVM),
it has a simple and classical logical reasoning mechanism with an inherent
imagery process based on the Vector Symbolic Architecture (VSA) model of
distributed neuronal processing. This paper extends the existing theory by
showing how scales, as required for adjective and verb semantics, can be
handled by the system.
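The scales and hedges in the title can be illustrated in classical fuzzy-logic terms (not the paper's CL/ABVM mechanism): a scale is modelled as a membership function over a measurable dimension, and hedges such as "very" and "somewhat" are operators that sharpen or relax membership, following Zadeh's standard definitions. The function names and thresholds below are illustrative assumptions, not taken from the paper:

```python
import math

def very(mu):
    """Concentration hedge: sharpens membership (Zadeh's 'very')."""
    return mu ** 2

def somewhat(mu):
    """Dilation hedge: relaxes membership (Zadeh's 'somewhat')."""
    return math.sqrt(mu)

def tall(height_cm, lo=160.0, hi=190.0):
    """Toy scale for the adjective 'tall': a linear ramp from 0 at
    `lo` to 1 at `hi` centimetres (thresholds chosen for illustration)."""
    return min(1.0, max(0.0, (height_cm - lo) / (hi - lo)))

mu = tall(175.0)           # 0.5 membership in "tall"
print(very(mu))            # stricter: "very tall" gives lower membership
print(somewhat(mu))        # laxer: "somewhat tall" gives higher membership
```

The design point is that hedges compose: `very(very(tall(x)))` is again a membership function, which is one reason scales are convenient for adjective semantics.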
Related papers
- Disentangling Logic: The Role of Context in Large Language Model Reasoning Capabilities [31.728976421529577]
We investigate the contrast across abstract and contextualized logical problems from a comprehensive set of domains.
We focus on standard propositional logic, specifically propositional deductive and abductive logic reasoning.
Our experiments aim to provide insights into disentangling context in logical reasoning and the true reasoning capabilities of LLMs.
arXiv Detail & Related papers (2024-06-04T21:25:06Z) - A Note on an Inferentialist Approach to Resource Semantics [48.65926948745294]
'Inferentialism' is the view that meaning is given in terms of inferential behaviour.
This paper shows how 'inferentialism' enables a versatile and expressive framework for resource semantics.
arXiv Detail & Related papers (2024-05-10T14:13:21Z) - Inferentialist Resource Semantics [48.65926948745294]
In systems modelling, a system typically comprises located resources relative to which processes execute.
This paper shows how inferentialism enables a versatile and expressive framework for resource semantics.
arXiv Detail & Related papers (2024-02-14T14:54:36Z) - LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and
Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
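The fuzzy continuous relaxation mentioned above — grounding logical formulae onto neural outputs so that logic can drive training — can be sketched generically with a product t-norm for conjunction and the Reichenbach relaxation for implication. This is a minimal sketch of the general technique, not LOGICSEG's actual loss; the class names and probabilities are hypothetical:

```python
def t_and(a, b):
    """Product t-norm: continuous relaxation of conjunction."""
    return a * b

def t_implies(a, b):
    """Reichenbach relaxation of implication: 1 - a + a*b."""
    return 1.0 - a + a * b

def rule_loss(p_cat, p_part, p_animal):
    """Relax the rule 'cat AND animal-part -> animal' into a
    differentiable penalty: high when the implication is far from true."""
    return 1.0 - t_implies(t_and(p_cat, p_part), p_animal)

# Predictions nearly satisfying the rule yield a small penalty;
# violating the consequent yields a large one.
print(rule_loss(0.9, 0.8, 0.95))
print(rule_loss(0.9, 0.8, 0.0))
```

Because every operator is a smooth function of the network's output probabilities, such a penalty can be added to an ordinary segmentation loss and minimised by gradient descent.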
arXiv Detail & Related papers (2023-09-24T05:43:19Z) - Morpho-logic from a Topos Perspective: Application to symbolic AI [2.781492199939609]
Modal logics have proved useful for many reasoning tasks in symbolic artificial intelligence (AI).
We propose to further develop and generalize this link between mathematical morphology and modal logic from a topos perspective.
We show that the modal logic is well adapted to define concrete and efficient operators for revision, merging, and abduction of new knowledge, or even spatial reasoning.
arXiv Detail & Related papers (2023-03-08T21:24:25Z) - Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z) - Logic-Driven Context Extension and Data Augmentation for Logical
Reasoning of Text [65.24325614642223]
We propose to understand logical symbols and expressions in the text to arrive at the answer.
Based on such logical information, we put forward a context extension framework and a data augmentation algorithm.
Our method achieves the state-of-the-art performance, and both logic-driven context extension framework and data augmentation algorithm can help improve the accuracy.
arXiv Detail & Related papers (2021-05-08T10:09:36Z) - Higher-order Logic as Lingua Franca -- Integrating Argumentative
Discourse and Deep Logical Analysis [0.0]
We present an approach towards the deep, pluralistic logical analysis of argumentative discourse.
We use state-of-the-art automated reasoning technology for classical higher-order logic.
arXiv Detail & Related papers (2020-07-02T11:07:53Z) - Logical Neural Networks [51.46602187496816]
We propose a novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning).
Every neuron has a meaning as a component of a formula in a weighted real-valued logic, yielding a highly interpretable disentangled representation.
Inference is omni rather than focused on predefined target variables, and corresponds to logical reasoning.
arXiv Detail & Related papers (2020-06-23T16:55:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.