A Logical Neural Network Structure With More Direct Mapping From Logical
Relations
- URL: http://arxiv.org/abs/2106.11463v1
- Date: Tue, 22 Jun 2021 00:53:08 GMT
- Title: A Logical Neural Network Structure With More Direct Mapping From Logical
Relations
- Authors: Gang Wang
- Abstract summary: Representing and storing logical relations correctly in computer systems is a prerequisite for automatic judgement and decision-making.
Current numeric ANN models are good at perceptual intelligence such as image recognition, but not at cognitive intelligence such as logical representation.
This paper proposes a novel logical ANN model by designing new logical neurons and links that meet the demands of logical representation.
- Score: 8.239523696224975
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Logical relations are pervasive in human activities. Humans use them
to make judgements and decisions according to various conditions, embodied in
the form of \emph{if-then} rules. As an important kind of cognitive
intelligence, correctly representing and storing logical relations in computer
systems is a prerequisite for automatic judgement and decision-making,
especially in high-risk domains such as medical diagnosis. However, current
numeric ANN (Artificial Neural Network) models are good at perceptual
intelligence such as image recognition but not at cognitive intelligence such
as logical representation, which blocks the further application of ANNs. To
address this, researchers have tried to design logical ANN models that
represent and store logical relations. Despite some advances in this research
area, the structures of recent logical ANN models still do not map directly to
logical relations, so the corresponding logical relations cannot be read out
from the network structures. Therefore, in order to represent logical relations
more clearly in the neural network structure and to read them back out of it,
this paper proposes a novel logical ANN model with new logical neurons and
links designed to meet the demands of logical representation. Compared with
recent logical ANN models, the proposed model corresponds more clearly to
logical relations through its more direct mapping, so logical relations can be
read out by following the connection patterns of the network structure.
Additionally, fewer neurons are used.
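To make the read-out property concrete, here is a minimal, hypothetical Python sketch in the spirit of the abstract (the paper's actual neuron and link definitions are not reproduced here): each if-then rule becomes one AND-style neuron whose incoming links are the rule's conditions and whose outgoing link is its conclusion, so the stored rules are recoverable from the connection pattern alone. The names RuleNeuron, LogicalNetwork, and read_out_rules are illustrative assumptions, not from the paper.

```python
# Minimal sketch of a "direct mapping" logical network: one AND-style
# neuron per if-then rule; the network structure itself stores the rules.
# All class and method names are hypothetical illustrations.

class RuleNeuron:
    """Fires iff all of its condition inputs hold (logical AND)."""
    def __init__(self, conditions, conclusion):
        self.conditions = conditions  # incoming links = rule antecedents
        self.conclusion = conclusion  # outgoing link = rule consequent

    def activate(self, facts):
        return all(c in facts for c in self.conditions)

class LogicalNetwork:
    def __init__(self):
        self.rule_neurons = []

    def add_rule(self, conditions, conclusion):
        self.rule_neurons.append(RuleNeuron(conditions, conclusion))

    def infer(self, facts):
        # Forward chaining: fire neurons until no new conclusion appears.
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for n in self.rule_neurons:
                if n.activate(facts) and n.conclusion not in facts:
                    facts.add(n.conclusion)
                    changed = True
        return facts

    def read_out_rules(self):
        # The key property claimed above: rules can be read back out
        # directly from the connection pattern of the network.
        return [(n.conditions, n.conclusion) for n in self.rule_neurons]

net = LogicalNetwork()
net.add_rule(["fever", "cough"], "flu_suspected")
net.add_rule(["flu_suspected", "high_risk"], "refer_to_doctor")
print(net.infer({"fever", "cough", "high_risk"}))  # derives both conclusions
print(net.read_out_rules())                        # rules recovered from links
```

Running the sketch forward-chains the two rules to derive refer_to_doctor, and read_out_rules recovers the stored if-then rules directly from the links, which is the read-out behaviour the abstract argues prior logical ANN structures lack.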
Related papers
- Convolutional Differentiable Logic Gate Networks [68.74313756770123]
We propose an approach for learning logic gate networks directly via a differentiable relaxation.
We build on this idea, extending it with deep logic gate tree convolutions and logical OR pooling; a minimal sketch of the underlying relaxation appears after this list.
On CIFAR-10, we achieve an accuracy of 86.29% using only 61 million logic gates, which improves over the SOTA while being 29x smaller.
arXiv Detail & Related papers (2024-11-07T14:12:00Z) - LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and
Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
arXiv Detail & Related papers (2023-09-24T05:43:19Z) - Modeling Hierarchical Reasoning Chains by Linking Discourse Units and
Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) that handles context at both the discourse and word levels as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z) - Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations end-to-end by evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z) - Neuro-Symbolic Inductive Logic Programming with Logical Neural Networks [65.23508422635862]
We propose learning rules with the recently proposed logical neural networks (LNN).
Compared to others, LNNs offer a strong connection to classical Boolean logic.
Our experiments on standard benchmarking tasks confirm that LNN rules are highly interpretable.
arXiv Detail & Related papers (2021-12-06T19:38:30Z) - Logic Tensor Networks [9.004005678155023]
We present Logic Tensor Networks (LTN), a neurosymbolic formalism and computational model that supports learning and reasoning.
We show that LTN provides a uniform language for the specification and the computation of several AI tasks.
arXiv Detail & Related papers (2020-12-25T22:30:18Z) - Neural Logic Reasoning [47.622957656745356]
We propose Logic-Integrated Neural Network (LINN) to integrate the power of deep learning and logic reasoning.
LINN learns basic logical operations such as AND, OR, NOT as neural modules, and conducts propositional logical reasoning through the network for inference.
Experiments show that LINN significantly outperforms state-of-the-art recommendation models in Top-K recommendation.
arXiv Detail & Related papers (2020-08-20T14:53:23Z)
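As a companion to the Convolutional Differentiable Logic Gate Networks entry above, the following is a minimal sketch, assuming PyTorch, of the basic differentiable-relaxation idea: each gate holds a learnable softmax mixture over real-valued relaxations of Boolean operators, so gate choice can be trained by gradient descent. That paper relaxes all 16 two-input Boolean functions; only three are shown here, and the class name DifferentiableGate is a hypothetical illustration.

```python
# Sketch of a differentiable logic gate: a learnable mixture over
# real-valued relaxations of Boolean ops (a subset of the 16 two-input
# functions). Not the authors' code; an illustration of the idea only.
import torch

REAL_VALUED_OPS = [
    lambda a, b: a * b,             # AND
    lambda a, b: a + b - a * b,     # OR
    lambda a, b: a + b - 2 * a * b, # XOR
]

class DifferentiableGate(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Logits over candidate ops; softmax keeps the mixture differentiable.
        self.logits = torch.nn.Parameter(torch.zeros(len(REAL_VALUED_OPS)))

    def forward(self, a, b):
        w = torch.softmax(self.logits, dim=0)
        # Soft truth values in [0, 1] flow through the weighted op mixture.
        return sum(wi * op(a, b) for wi, op in zip(w, REAL_VALUED_OPS))

gate = DifferentiableGate()
a, b = torch.tensor([0.9]), torch.tensor([0.2])
print(gate(a, b))  # gradient flows into gate.logits during training
```

After training, each gate can be discretized to its highest-weight operator, yielding a pure logic-gate network for inference.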