Evaluating Relaxations of Logic for Neural Networks: A Comprehensive
Study
- URL: http://arxiv.org/abs/2107.13646v1
- Date: Wed, 28 Jul 2021 21:16:58 GMT
- Title: Evaluating Relaxations of Logic for Neural Networks: A Comprehensive
Study
- Authors: Mattia Medina Grespan, Ashim Gupta and Vivek Srikumar
- Abstract summary: We study the question of how best to relax logical expressions that represent labeled examples and knowledge about a problem.
We present theoretical and empirical criteria for characterizing which relaxation would perform best in various scenarios.
- Score: 17.998891912502092
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Symbolic knowledge can provide crucial inductive bias for training neural
models, especially in low data regimes. A successful strategy for incorporating
such knowledge involves relaxing logical statements into sub-differentiable
losses for optimization. In this paper, we study the question of how best to
relax logical expressions that represent labeled examples and knowledge about a
problem; we focus on sub-differentiable t-norm relaxations of logic. We present
theoretical and empirical criteria for characterizing which relaxation would
perform best in various scenarios. In our theoretical study driven by the goal
of preserving tautologies, the Łukasiewicz t-norm performs best. However, in
our empirical analysis on the text chunking and digit recognition tasks, the
product t-norm achieves best predictive performance. We analyze this apparent
discrepancy, and conclude with a list of best practices for defining loss
functions via logic.
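To make the abstract's core idea concrete, here is a minimal sketch (an illustration, not the authors' code) of relaxing a logical statement into a differentiable loss: truth values live in [0, 1], conjunction is replaced by a t-norm, disjunction by its dual t-conorm, and an implication rule becomes a loss that penalizes truth values below 1.

```python
# Minimal sketch of t-norm relaxations of logic (illustrative only).
# Truth values are in [0, 1]; a classical statement is satisfied at 1.

def product_and(a, b):
    # Product t-norm: conjunction as multiplication.
    return a * b

def lukasiewicz_and(a, b):
    # Lukasiewicz t-norm: truncated addition.
    return max(0.0, a + b - 1.0)

def product_or(a, b):
    # Dual t-conorm of the product t-norm (probabilistic sum).
    return a + b - a * b

def lukasiewicz_or(a, b):
    # Dual t-conorm of the Lukasiewicz t-norm (bounded sum).
    return min(1.0, a + b)

def s_implication(a, b, t_conorm):
    # Relax "a -> b" as "(not a) or b" using the chosen t-conorm.
    return t_conorm(1.0 - a, b)

def loss(truth_value):
    # Penalize statements whose relaxed truth value falls below 1.
    return 1.0 - truth_value

# Relax the rule "premise -> conclusion" under both t-norms.
premise, conclusion = 0.9, 0.4
prod = s_implication(premise, conclusion, product_or)      # 0.1 + 0.4 - 0.04 = 0.46
luka = s_implication(premise, conclusion, lukasiewicz_or)  # min(1, 0.1 + 0.4) = 0.5
print(loss(prod), loss(luka))  # the two relaxations disagree on the loss
```

In a neural setting, `premise` and `conclusion` would be model probabilities, and the resulting loss is minimized alongside the usual supervised objective; the choice of t-norm changes the gradients the model receives, which is exactly the trade-off the paper studies.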
Related papers
- On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function that offers more mathematical opportunities for analyzing closed-form training dynamics.
The unhinged loss also makes it possible to consider more practical techniques, such as time-varying learning rates and feature normalization.
arXiv Detail & Related papers (2023-12-13T02:11:07Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Reduced Implication-bias Logic Loss for Neuro-Symbolic Learning [11.343715006460577]
Differentiable operators could bring a significant bias during backpropagation and degrade the performance of Neuro-Symbolic learning.
We propose a simple yet effective method to transform the biased loss functions into a Reduced Implication-bias Logic Loss (RILL).
Empirical study shows that RILL can achieve significant improvements compared with the biased logic loss functions.
arXiv Detail & Related papers (2022-08-14T11:57:46Z)
- MERIt: Meta-Path Guided Contrastive Learning for Logical Reasoning [63.50909998372667]
We propose MERIt, a MEta-path guided contrastive learning method for logical ReasonIng of text.
Two novel strategies serve as indispensable components of our method.
arXiv Detail & Related papers (2022-03-01T11:13:00Z)
- Neuro-Symbolic Inductive Logic Programming with Logical Neural Networks [65.23508422635862]
We propose learning rules with the recently proposed logical neural networks (LNNs).
Compared to other approaches, LNNs offer a strong connection to classical Boolean logic.
Our experiments on standard benchmarking tasks confirm that LNN rules are highly interpretable.
arXiv Detail & Related papers (2021-12-06T19:38:30Z)
- Neural Logic Reasoning [47.622957656745356]
We propose Logic-Integrated Neural Network (LINN) to integrate the power of deep learning and logic reasoning.
LINN learns basic logical operations such as AND, OR, NOT as neural modules, and conducts propositional logical reasoning through the network for inference.
Experiments show that LINN significantly outperforms state-of-the-art recommendation models in Top-K recommendation.
arXiv Detail & Related papers (2020-08-20T14:53:23Z)
- Logical Neural Networks [51.46602187496816]
We propose a novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning).
Every neuron has a meaning as a component of a formula in a weighted real-valued logic, yielding a highly interpretable disentangled representation.
Inference is omnidirectional rather than focused on predefined target variables, and corresponds to logical reasoning.
arXiv Detail & Related papers (2020-06-23T16:55:45Z)
- Analyzing Differentiable Fuzzy Logic Operators [3.4806267677524896]
We study how a large collection of logical operators from the fuzzy logic literature behave in a differentiable learning setting.
We show that it is possible to use Differentiable Fuzzy Logics for semi-supervised learning, and compare how different operators behave in practice.
arXiv Detail & Related papers (2020-02-14T16:11:36Z)
- Relational Neural Machines [19.569025323453257]
This paper presents a novel framework that allows jointly training the parameters of the learners and of a First-Order Logic based reasoner.
A Relational Neural Machine is able to recover both classical learning results in the case of pure sub-symbolic learning, and Markov Logic Networks.
Proper algorithmic solutions are devised to make learning and inference tractable in large-scale problems.
arXiv Detail & Related papers (2020-02-06T10:53:57Z)