Differentiable Inductive Logic Programming in High-Dimensional Space
- URL: http://arxiv.org/abs/2208.06652v3
- Date: Sat, 19 Aug 2023 07:36:45 GMT
- Title: Differentiable Inductive Logic Programming in High-Dimensional Space
- Authors: Stanisław J. Purgał, David M. Cerna, Cezary Kaliszyk
- Abstract summary: We propose extending the δILP approach to inductive synthesis with large-scale predicate invention.
We show that large-scale predicate invention benefits differentiable inductive synthesis through gradient descent.
- Score: 6.21540494241516
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Synthesizing large logic programs through symbolic Inductive Logic
Programming (ILP) typically requires intermediate definitions. However,
cluttering the hypothesis space with intensional predicates typically degrades
performance. In contrast, gradient descent provides an efficient way to find
solutions within such high-dimensional spaces. Neuro-symbolic ILP approaches
have not fully exploited this so far. We propose extending the δILP
approach to inductive synthesis with large-scale predicate invention, thus
allowing us to exploit the efficacy of high-dimensional gradient descent. We
show that large-scale predicate invention benefits differentiable inductive
synthesis through gradient descent and allows one to learn solutions for tasks
beyond the capabilities of existing neuro-symbolic ILP systems. Furthermore, we
achieve these results without specifying the precise structure of the solution
within the language bias.
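The mechanism exploited here, relaxing clause choice to continuous weights so that gradient descent can search an otherwise combinatorial hypothesis space, can be illustrated with a minimal sketch. This is a toy under stated assumptions, not the authors' system: the ground atoms, the candidate clauses, the product t-norm, and the use of PyTorch are all illustrative.

```python
# Differentiable clause selection in the deltaILP style: each candidate
# clause gets a weight, fuzzy forward chaining produces a valuation, and
# gradient descent on a cross-entropy loss picks out the correct clauses.
import torch

# Ground atoms: 0 edge(a,b), 1 edge(b,c), 2 path(a,b), 3 path(b,c),
# 4 path(a,c), 5 path(c,a)
facts  = torch.tensor([1., 1., 0., 0., 0., 0.])   # known base facts
target = torch.tensor([1., 1., 1., 1., 1., 0.])   # desired final valuation

clauses = [                 # candidate ground clauses (head <- body)
    (2, [0]),               # path(a,b) <- edge(a,b)              (correct)
    (3, [1]),               # path(b,c) <- edge(b,c)              (correct)
    (4, [2, 3]),            # path(a,c) <- path(a,b), path(b,c)   (correct)
    (5, [0]),               # path(c,a) <- edge(a,b)              (spurious)
    (5, [1]),               # path(c,a) <- edge(b,c)              (spurious)
]
w = torch.zeros(len(clauses), requires_grad=True)  # one weight per clause
opt = torch.optim.Adam([w], lr=0.1)

for step in range(300):
    v = facts.clone()
    for _ in range(3):                             # fuzzy forward chaining
        new_v = v
        for i, (head, body) in enumerate(clauses):
            fire = torch.sigmoid(w[i]) * torch.prod(v[body])  # product t-norm
            mask = torch.zeros(6)
            mask[head] = 1.0
            new_v = torch.maximum(new_v, fire * mask)
        v = new_v
    loss = torch.nn.functional.binary_cross_entropy(v, target)
    opt.zero_grad(); loss.backward(); opt.step()

print(torch.sigmoid(w))   # high weights on the correct clauses, low otherwise
```

In this picture, large-scale predicate invention amounts to adding many fresh intensional atoms and candidate clauses over them: it enlarges the weight vector rather than the symbolic search procedure, which is what makes high-dimensional gradient descent attractive.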
Related papers
- Logic-induced Diagnostic Reasoning for Semi-supervised Semantic Segmentation [85.12429517510311]
LogicDiag is a neural-logic semi-supervised learning framework for semantic segmentation.
Our key insight is that conflicts within pseudo labels, identified through symbolic knowledge, can serve as strong yet commonly ignored learning signals.
We showcase the practical application of LogicDiag in the data-hungry segmentation scenario, where we formalize the structured abstraction of semantic concepts as a set of logic rules.
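A toy sketch of the conflict-mining idea follows; the classes, the single composition rule, and the 8-neighbourhood test are illustrative assumptions, not the paper's rule set.

```python
# Diagnose pseudo labels against a composition rule: pixels pseudo-labeled
# as a part class ("head") that have no whole-class pixel ("person")
# nearby are flagged as logical conflicts.
import numpy as np

HEAD, PERSON = 1, 2                    # hypothetical class ids
rules = [(HEAD, PERSON)]               # part class => whole class nearby

pseudo = np.array([[1, 0, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 2, 2],
                   [0, 0, 2, 1]])      # pseudo-label map for one toy image

def dilate(mask):
    # 8-neighbourhood binary dilation via shifts (np.roll wraps at the
    # border, which is acceptable for this toy example).
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

conflict = np.zeros_like(pseudo, dtype=bool)
for part, whole in rules:
    near_whole = dilate(pseudo == whole)
    conflict |= (pseudo == part) & ~near_whole

print(conflict.astype(int))   # flags the isolated "head" pixel at (0, 0)
```

In LogicDiag itself the flagged conflicts are used as learning signals for rectifying the pseudo labels, not merely as a mask that discards them.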
arXiv Detail & Related papers (2023-08-24T06:50:07Z)
- logLTN: Differentiable Fuzzy Logic in the Logarithm Space [11.440949097704943]
A trend in the literature involves integrating axioms and facts in loss functions by grounding logical symbols with neural networks and fuzzy semantics.
This paper presents a configuration of fuzzy operators for grounding formulas end-to-end in the logarithm space.
Our findings, both formal and empirical, show that the proposed configuration outperforms the state-of-the-art.
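One plausible log-space configuration can be sketched as follows: with the product t-norm, conjunction of log-truth-values becomes addition, while negation and disjunction stay numerically stable via log1p. This is an illustration of the idea, not necessarily the paper's exact operator set.

```python
# Fuzzy connectives evaluated directly on log-truth-values log(t) <= 0.
import torch

def log_and(log_a, log_b):
    return log_a + log_b                      # log(a * b)

def log_not(log_a):
    # log(1 - a), computed stably from log(a)
    return torch.log1p(-torch.exp(log_a).clamp(max=1 - 1e-7))

def log_or(log_a, log_b):
    # probabilistic sum: 1 - (1 - a)(1 - b)
    return log_not(log_and(log_not(log_a), log_not(log_b)))

# Log-truth values as a network might emit them (e.g. via logsigmoid).
a = torch.tensor([-0.1, -3.0])
b = torch.tensor([-0.5, -0.2])
print(log_and(a, b).exp())   # fuzzy "a and b"
print(log_or(a, b).exp())    # fuzzy "a or b"
```

Working in the logarithm space keeps long conjunctions from underflowing and from producing vanishing gradients, which is the practical point of such a configuration.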
arXiv Detail & Related papers (2023-06-26T09:39:05Z)
- Towards Invertible Semantic-Preserving Embeddings of Logical Formulae [1.0152838128195467]
Learning and optimising logical requirements and rules have long been an important problem in Artificial Intelligence.
Current methods are able to construct effective semantic-preserving embeddings via kernel methods, but the map they define is not invertible.
In this work we address this problem, learning how to invert such an embedding leveraging deep architectures based on the Graph Variational Autoencoder framework.
arXiv Detail & Related papers (2023-05-03T10:49:01Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective at solving forward and inverse differential equation problems.
However, PINNs become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose employing the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
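The implicit (backward-Euler) update evaluates the gradient at the next iterate, theta_{k+1} = theta_k - lr * grad L(theta_{k+1}), which damps the instability that explicit steps exhibit on stiff losses. The sketch below uses a toy quadratic rather than a PINN, and the closed-form solve is specific to the quadratic; in general the implicit equation needs an inner solver.

```python
import numpy as np

def grad(theta):                 # toy stiff loss L(theta) = 50 * theta**2
    return 100.0 * theta

def implicit_step(theta, lr):
    # Backward Euler: theta_next = theta - lr * grad(theta_next).
    # For this quadratic the implicit equation has a closed form.
    return theta / (1.0 + 100.0 * lr)

theta_explicit = theta_implicit = 1.0
lr = 0.05                        # well above the explicit stability limit 0.02
for _ in range(10):
    theta_explicit -= lr * grad(theta_explicit)         # factor -4: diverges
    theta_implicit = implicit_step(theta_implicit, lr)  # factor 1/6: converges

print(f"explicit: {theta_explicit:.3g}, implicit: {theta_implicit:.3g}")
```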
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples [10.482805367361818]
We propose a novel, fully explainable neural approach to the synthesis of logic circuits from input-output examples.
Our method can be employed with a virtually arbitrary choice of atoms.
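A common way to make gate choice differentiable is a softmax over a fixed library of gates, trained on the input-output examples. The four-gate library and single-gate "circuit" below are toy assumptions, not the paper's architecture.

```python
import torch

def gate_outputs(a, b):
    # Truth values of each library gate on soft inputs a, b in [0, 1].
    return torch.stack([
        a * b,               # AND
        a + b - a * b,       # OR
        a + b - 2 * a * b,   # XOR
        1 - a * b,           # NAND
    ])

logits = torch.zeros(4, requires_grad=True)   # soft gate choice
opt = torch.optim.Adam([logits], lr=0.2)

A = torch.tensor([0., 0., 1., 1.])            # input-output examples for XOR
B = torch.tensor([0., 1., 0., 1.])
Y = torch.tensor([0., 1., 1., 0.])

for _ in range(200):
    mix = torch.softmax(logits, dim=0)
    pred = (mix[:, None] * gate_outputs(A, B)).sum(dim=0)
    loss = torch.nn.functional.mse_loss(pred, Y)
    opt.zero_grad(); loss.backward(); opt.step()

print(torch.softmax(logits, dim=0))   # mass concentrates on the XOR gate
```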
arXiv Detail & Related papers (2022-10-29T14:06:42Z)
- Semantic Probabilistic Layers for Neuro-Symbolic Learning [83.25785999205932]
We design a predictive layer for structured-output prediction (SOP).
It can be plugged into any neural network, guaranteeing that its predictions are consistent with a set of predefined symbolic constraints.
Our Semantic Probabilistic Layer (SPL) can model intricate correlations and hard constraints over a structured output space.
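A drastically simplified stand-in conveys the interface: enumerate joint label states, forbid the ones violating the constraints, renormalize, and read off marginals that are consistent by construction. The real SPL achieves this tractably with probabilistic circuits; the label set and constraints here are assumptions.

```python
import itertools
import torch

labels = ["cat", "dog", "animal"]          # hypothetical label set

def valid(s):                              # constraints: cat=>animal, dog=>animal
    cat, dog, animal = s
    return (not cat or animal) and (not dog or animal)

states = list(itertools.product([0, 1], repeat=3))
mask = torch.tensor([valid(s) for s in states])
S = torch.tensor(states, dtype=torch.float)

logits = torch.tensor([2.0, -1.0, -2.0])   # per-label scores from a network
scores = S @ logits                        # log-potential of each joint state
scores[~mask] = float("-inf")              # forbid inconsistent states
p_state = torch.softmax(scores, dim=0)     # distribution over valid states
marginals = p_state @ S                    # P(label = 1), consistent by design
print(dict(zip(labels, marginals.tolist())))
```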
arXiv Detail & Related papers (2022-06-01T12:02:38Z)
- GALOIS: Boosting Deep Reinforcement Learning via Generalizable Logic Synthesis [34.54658276390227]
Deep reinforcement learning (DRL) lacks the high-order intelligence required for learning and generalization in complex problems.
Previous works attempt to directly synthesize a white-box logic program as the DRL policy, manifesting logic-driven behaviors.
We propose a novel Generalizable Logic Synthesis (GALOIS) framework to synthesize hierarchical and strict cause-effect logic programs.
arXiv Detail & Related papers (2022-05-27T02:50:13Z)
- Activation Relaxation: A Local Dynamical Approximation to Backpropagation in the Brain [62.997667081978825]
Activation Relaxation (AR) is motivated by constructing the backpropagation gradient as the equilibrium point of a dynamical system.
Our algorithm converges rapidly and robustly to the correct backpropagation gradients, requires only a single type of computational unit, and can operate on arbitrary computation graphs.
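A minimal sketch of the relaxation on a toy MLP shows the fixed point matching the backprop errors; the network sizes, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
f, df = np.tanh, lambda a: 1.0 - np.tanh(a) ** 2

W = [rng.standard_normal((4, 3)),     # a 3-layer MLP on one input
     rng.standard_normal((4, 4)),
     rng.standard_normal((2, 4))]
h = [rng.standard_normal(3)]          # h[l]: input to layer l
a = []                                # a[l]: pre-activation of layer l
for Wl in W:
    a.append(Wl @ h[-1])
    h.append(f(a[-1]))
y = np.ones(2)                        # loss L = 0.5 * ||h[-1] - y||^2

# Reference: explicit backprop errors delta[l] = dL/da[l].
delta = [None, None, (h[-1] - y) * df(a[2])]
for l in (1, 0):
    delta[l] = (W[l + 1].T @ delta[l + 1]) * df(a[l])

# Activation Relaxation: gradient units x[l] relax under local dynamics
# whose equilibrium is delta[l]; the top-layer unit is clamped.
x = [np.zeros(4), np.zeros(4), delta[2]]
eta = 0.2
for _ in range(200):
    for l in (1, 0):                  # each update uses only local signals
        x[l] += eta * (-x[l] + (W[l + 1].T @ x[l + 1]) * df(a[l]))

print(np.allclose(x[0], delta[0]), np.allclose(x[1], delta[1]))  # True True
# A local weight update would then be dW[l] ~ -outer(x[l], h[l]).
```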
arXiv Detail & Related papers (2020-09-11T11:56:34Z)
- Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve for one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under a sparsity constraint.
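The bilinear setting can be illustrated with a rank-one factorization under an l1 penalty, updated with a synchronous gradient/proximal step. This sketches the problem class, not the paper's exact coupled update rule; the data and step sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
v_true = np.array([1.0, 0.0, 0.0, 2.0, 0.0])   # sparse factor
u_true = rng.standard_normal(6)
A = np.outer(u_true, v_true)                   # observed bilinear data

u = rng.standard_normal(6)
v = rng.standard_normal(5)
lr, lam = 0.05, 0.01

def soft_threshold(z, t):                      # proximal step for lam*||v||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

for _ in range(500):
    R = np.outer(u, v) - A                     # residual of 0.5*||u v^T - A||^2
    gu, gv = R @ v, R.T @ u                    # gradients w.r.t. u and v
    u, v = u - lr * gu, soft_threshold(v - lr * gv, lr * lam)  # joint step

print(np.round(v, 2))   # recovers the sparse support (up to scale and sign)
```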
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
- Physarum Powered Differentiable Linear Programming Layers and Applications [48.77235931652611]
We propose an efficient and differentiable solver for general linear programming problems.
We show the use of our solver in a video segmentation task and meta-learning for few-shot learning.
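The Physarum dynamics for a positive LP (min c^T x subject to Ax = b, x >= 0) make a compact sketch: every step is a smooth function of the problem data, which is what makes such solvers candidates for differentiable layers. The problem instance below is a toy assumption, not the paper's implementation.

```python
import numpy as np

A = np.array([[1., 1., 0.],     # constraints: x0 + x1 = 1, x1 + x2 = 1
              [0., 1., 1.]])
b = np.array([1., 1.])
c = np.array([1., 3., 1.])      # costs; optimum is x = [1, 0, 1]

x = np.ones(3)                  # strictly positive start
h = 0.5                         # step size
for _ in range(200):
    W = np.diag(x / c)
    p = np.linalg.solve(A @ W @ A.T, b)   # solve (A W A^T) p = b
    q = W @ A.T @ p
    x = (1 - h) * x + h * q               # Physarum update

print(np.round(x, 3))           # converges to the LP optimum [1, 0, 1]
```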
arXiv Detail & Related papers (2020-04-30T01:50:37Z)
- Towards Neural-Guided Program Synthesis for Linear Temporal Logic Specifications [26.547133495699093]
We use a neural network to learn a Q-function that is then used to guide search, and to construct programs that are subsequently verified for correctness.
Our method is unique in combining search with deep learning to realize synthesis.
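The overall shape, best-first search over partial programs ranked by a Q-function with complete candidates checked by a verifier, can be sketched generically; the grammar, the hand-stubbed scorer, and the specification below are illustrative stand-ins, not the paper's system.

```python
import heapq
import itertools

GRAMMAR = {"E": [["E", "+", "E"], ["x"], ["1"]]}   # toy expression grammar

def expansions(partial):
    i = partial.index("E")                 # expand the left-most nonterminal
    for rhs in GRAMMAR["E"]:
        yield partial[:i] + rhs + partial[i + 1:]

def q_score(partial):
    return -len(partial)                   # stand-in for the learned Q-network

def verify(tokens):
    expr = "".join(tokens)                 # spec: program must compute x + 1
    return all(eval(expr, {"x": v}) == v + 1 for v in (0, 3, 7))

tie = itertools.count()                    # tie-breaker for the heap
frontier = [(-q_score(["E"]), next(tie), ["E"])]
while frontier:
    _, _, prog = heapq.heappop(frontier)
    if "E" not in prog:                    # complete candidate: verify it
        if verify(prog):
            print("found:", "".join(prog))  # -> x+1
            break
        continue
    for nxt in expansions(prog):
        if len(nxt) <= 7:                  # length bound on partial programs
            heapq.heappush(frontier, (-q_score(nxt), next(tie), nxt))
```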
arXiv Detail & Related papers (2019-12-31T17:09:49Z)