Scalable Neural-Probabilistic Answer Set Programming
- URL: http://arxiv.org/abs/2306.08397v1
- Date: Wed, 14 Jun 2023 09:45:29 GMT
- Title: Scalable Neural-Probabilistic Answer Set Programming
- Authors: Arseny Skryagin and Daniel Ochs and Devendra Singh Dhami and Kristian
Kersting
- Abstract summary: We introduce SLASH, a novel DPPL that consists of Neural-Probabilistic Predicates (NPPs) and a logic program, united via answer set programming (ASP).
We show how to prune the stochastically insignificant parts of the (ground) program, speeding up reasoning without sacrificing predictive performance.
We evaluate SLASH on a variety of tasks, including the benchmark task of MNIST addition and Visual Question Answering (VQA).
- Score: 18.136093815001423
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The goal of combining the robustness of neural networks and the
expressiveness of symbolic methods has rekindled the interest in Neuro-Symbolic
AI. Deep Probabilistic Programming Languages (DPPLs) have been developed for
probabilistic logic programming to be carried out via the probability
estimations of deep neural networks. However, recent SOTA DPPL approaches allow
for only limited conditional probabilistic queries and do not offer the power
of true joint probability estimation. In our work, we propose an easy
integration of tractable probabilistic inference within a DPPL. To this end, we
introduce SLASH, a novel DPPL that consists of Neural-Probabilistic Predicates
(NPPs) and a logic program, united via answer set programming (ASP). NPPs are a
novel design principle allowing for combining all deep model types and
combinations thereof to be represented as a single probabilistic predicate. In
this context, we introduce a novel $+/-$ notation for answering various types
of probabilistic queries by adjusting the atom notations of a predicate. To
scale well, we show how to prune the stochastically insignificant parts of the
(ground) program, speeding up reasoning without sacrificing the predictive
performance. We evaluate SLASH on a variety of different tasks, including the
benchmark task of MNIST addition and Visual Question Answering (VQA).
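The MNIST-addition benchmark mentioned above can be illustrated with a minimal, self-contained sketch. This is not SLASH's actual API or syntax; it is a hedged Python toy showing how a joint query P(d1 + d2 = s) marginalizes over the output distributions of two digit classifiers (the role NPPs play in SLASH), with made-up softmax values standing in for real network outputs.

```python
import numpy as np

# Hypothetical softmax outputs of two digit classifiers for images of a
# "3" and a "5" (stand-ins for NPP outputs; values are illustrative only).
p_d1 = np.full(10, 0.01)
p_d1[3] = 0.91
p_d2 = np.full(10, 0.01)
p_d2[5] = 0.91

def prob_sum(p1, p2, s):
    """P(d1 + d2 = s): marginalize over all digit pairs whose sum is s."""
    return sum(p1[a] * p2[b]
               for a in range(10) for b in range(10) if a + b == s)

# Distribution over all possible sums 0..18, and the most probable sum.
probs = [prob_sum(p_d1, p_d2, s) for s in range(19)]
best = max(range(19), key=lambda s: probs[s])
```

Because the two classifiers peak at 3 and 5, the query concentrates mass on the sum 8; a real DPPL performs this marginalization over the (pruned) ground program rather than by brute-force enumeration.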
Related papers
- Towards Probabilistic Inductive Logic Programming with Neurosymbolic Inference and Relaxation [0.0]
We propose Propper, which handles flawed and probabilistic background knowledge.
For relational patterns in noisy images, Propper can learn programs from as few as 8 examples.
It outperforms binary ILP and statistical models such as a Graph Neural Network.
arXiv Detail & Related papers (2024-08-21T06:38:49Z)
- dPASP: A Comprehensive Differentiable Probabilistic Answer Set Programming Environment For Neurosymbolic Learning and Reasoning [0.0]
We present dPASP, a novel declarative logic programming framework for differentiable neuro-symbolic reasoning.
We discuss the several semantics for probabilistic logic programs that can express nondeterministic, contradictory, incomplete and/or statistical knowledge.
We then describe an implemented package that supports inference and learning in the language, along with several example programs.
arXiv Detail & Related papers (2023-08-05T19:36:58Z)
- Neural Probabilistic Logic Programming in Discrete-Continuous Domains [9.94537457589893]
Neural-symbolic AI (NeSy) allows neural networks to exploit symbolic background knowledge in the form of logic.
Probabilistic NeSy focuses on integrating neural networks with both logic and probability theory.
DeepSeaProbLog is a neural probabilistic logic programming language that incorporates DPP techniques into NeSy.
arXiv Detail & Related papers (2023-03-08T15:27:29Z)
- Semantic Strengthening of Neuro-Symbolic Learning [85.6195120593625]
Neuro-symbolic approaches typically resort to fuzzy approximations of a probabilistic objective.
We show how to compute this efficiently for tractable circuits.
We test our approach on three tasks: predicting a minimum-cost path in Warcraft, predicting a minimum-cost perfect matching, and solving Sudoku puzzles.
arXiv Detail & Related papers (2023-02-28T00:04:22Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- plingo: A system for probabilistic reasoning in clingo based on lpmln [2.7742922296398738]
We present plingo, an extension of the ASP system clingo with various probabilistic reasoning modes.
Plingo is centered upon LPMLN, a probabilistic extension of ASP based on a weight scheme from Markov Logic.
We evaluate plingo's performance empirically by comparing it to other probabilistic systems.
arXiv Detail & Related papers (2022-06-23T07:51:10Z)
- Semantic Probabilistic Layers for Neuro-Symbolic Learning [83.25785999205932]
We design a predictive layer for structured-output prediction (SOP).
It can be plugged into any neural network guaranteeing its predictions are consistent with a set of predefined symbolic constraints.
Our Semantic Probabilistic Layer (SPL) can model intricate correlations, and hard constraints, over a structured output space.
arXiv Detail & Related papers (2022-06-01T12:02:38Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show the principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
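The Nadaraya-Watson estimate mentioned above is a standard kernel method, so it can be sketched generically. This is a hedged illustration of the idea (kernel-weighting training labels to estimate the conditional label distribution), not NUQ's actual implementation; the function name, Gaussian kernel choice, and bandwidth parameter `h` are all assumptions for the sketch.

```python
import numpy as np

def nw_label_dist(x, X, y, n_classes, h=1.0):
    """Nadaraya-Watson estimate of P(label | x): each training label is
    weighted by a Gaussian kernel on the distance from x to x_i, then
    the per-class weight sums are normalized into a distribution."""
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2.0 * h ** 2))
    p = np.array([w[y == c].sum() for c in range(n_classes)])
    return p / p.sum()

# Tiny example: two class-0 points near the query, one distant class-1 point.
X = np.array([[0.0], [0.1], [5.0]])
y = np.array([0, 0, 1])
p = nw_label_dist(np.array([0.05]), X, y, n_classes=2)
```

The spread of this estimated distribution is one natural source of an uncertainty score for a deterministic classifier's prediction.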
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- SLASH: Embracing Probabilistic Circuits into Neural Answer Set Programming [15.814914345000574]
We introduce SLASH -- a novel deep probabilistic programming language (DPPL).
At its core, SLASH consists of Neural-Probabilistic Predicates (NPPs) and logical programs which are united via answer set programming.
We evaluate SLASH on the benchmark data of MNIST addition as well as novel tasks for DPPLs such as missing data prediction and set prediction with state-of-the-art performance.
arXiv Detail & Related papers (2021-10-07T12:35:55Z)
- Statistically Meaningful Approximation: a Case Study on Approximating Turing Machines with Transformers [50.85524803885483]
This work proposes a formal definition of statistically meaningful (SM) approximation which requires the approximating network to exhibit good statistical learnability.
We study SM approximation for two function classes: circuits and Turing machines.
arXiv Detail & Related papers (2021-07-28T04:28:55Z)
- Mitigating Performance Saturation in Neural Marked Point Processes: Architectures and Loss Functions [50.674773358075015]
We propose a simple graph-based network structure called GCHP, which utilizes only graph convolutional layers.
We show that GCHP can significantly reduce training time and the likelihood ratio loss with interarrival time probability assumptions can greatly improve the model performance.
arXiv Detail & Related papers (2021-07-07T16:59:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.