SCL(FOL) Can Simulate Non-Redundant Superposition Clause Learning
- URL: http://arxiv.org/abs/2305.12926v1
- Date: Mon, 22 May 2023 11:12:39 GMT
- Title: SCL(FOL) Can Simulate Non-Redundant Superposition Clause Learning
- Authors: Martin Bromberger and Chaahat Jain and Christoph Weidenbach
- Abstract summary: We show that SCL(FOL) can simulate the derivation of non-redundant clauses by superposition for first-order logic without equality.
Superposition-based reasoning is performed with respect to a fixed reduction ordering.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We show that SCL(FOL) can simulate the derivation of non-redundant clauses by
superposition for first-order logic without equality. Superposition-based
reasoning is performed with respect to a fixed reduction ordering. The
completeness proof of superposition relies on the grounding of the clause set.
It builds a ground partial model according to the fixed ordering, where minimal
false ground instances of clauses then trigger non-redundant superposition
inferences. We define a respective strategy for the SCL calculus such that
clauses learned by SCL and superposition inferences coincide. From this
perspective the SCL calculus can be viewed as a generalization of the
superposition calculus.
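The ordered model construction the abstract refers to can be illustrated with a toy propositional analogue (a minimal sketch, not the first-order SCL/superposition machinery; the clause encoding and helper names are illustrative assumptions):

```python
def build_partial_model(clauses):
    """Toy propositional analogue of ordered partial model construction.

    Clauses are frozensets of nonzero ints: n means atom n, -n its negation.
    Clauses are processed in a fixed order (here: by largest atom). A clause
    that is false in the current partial model and has a positive maximal
    literal "produces" that atom; a false clause with a negative maximal
    literal is the minimal false clause that would trigger an inference.
    """
    def is_true(clause, model):
        return any((l > 0 and l in model) or (l < 0 and -l not in model)
                   for l in clause)

    model = set()
    for clause in sorted(clauses, key=lambda c: max(abs(l) for l in c)):
        if not is_true(clause, model):
            max_lit = max(clause, key=abs)
            if max_lit > 0:
                model.add(max_lit)      # productive clause
            else:
                return model, clause    # minimal false clause found
    return model, None                  # all clauses satisfied


# {1} and {-1, 2} are productive; {-1, -2} stays false and triggers an inference.
model, false_clause = build_partial_model(
    [frozenset({1}), frozenset({-1, 2}), frozenset({-1, -2})])
# model == {1, 2}, false_clause == frozenset({-1, -2})
```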
Related papers
- Decidable Fragments of LTLf Modulo Theories (Extended Version) [66.25779635347122]
In general, LTLf Modulo Theories (LTLfMT) was shown to be semi-decidable for any decidable first-order theory (e.g., linear arithmetic) via a tableau-based semi-decision procedure.
We show that for any LTLfMT formula satisfying an abstract semantic condition, which we call finite memory, the tableau augmented with a new rule is also guaranteed to terminate.
arXiv Detail & Related papers (2023-07-31T17:02:23Z)
- Resolution for Constrained Pseudo-Propositional Logic [0.0]
This work shows how propositional resolution can be generalized to obtain a resolution proof system for constrained pseudo-propositional logic (CPPL).
Unlike CNF formulas, which are restricted to a finite set of clauses, the extended CPPL does not require the corresponding set to be finite.
arXiv Detail & Related papers (2023-06-11T09:17:24Z)
- What and How does In-Context Learning Learn? Bayesian Model Averaging, Parameterization, and Generalization [111.55277952086155]
We study In-Context Learning (ICL) by addressing several open questions.
We show that, without updating the neural network parameters, ICL implicitly implements the Bayesian model averaging algorithm.
We prove that the error of the pretrained model is bounded by the sum of an approximation error and a generalization error.
arXiv Detail & Related papers (2023-05-30T21:23:47Z)
- Enhancing Adversarial Contrastive Learning via Adversarial Invariant Regularization [59.77647907277523]
Adversarial contrastive learning (ACL) is a technique that enhances standard contrastive learning (SCL).
In this paper, we propose adversarial invariant regularization (AIR) to enforce independence from style factors.
arXiv Detail & Related papers (2023-04-30T03:12:21Z)
- Supervised Contrastive Learning with Hard Negative Samples [16.42457033976047]
Contrastive learning (CL) learns a useful representation function by pulling positive samples close to each other.
In absence of class information, negative samples are chosen randomly and independently of the anchor.
Supervised CL (SCL) avoids this class collision by conditioning the negative sampling distribution to samples having labels different from that of the anchor.
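The class-conditional negative sampling this summary describes can be sketched in a few lines (a hedged illustration only; the function and parameter names below are assumptions, not the paper's API):

```python
import numpy as np

def sample_negatives(labels, anchor_idx, k, rng):
    """Supervised CL negative sampling: draw k negatives whose labels differ
    from the anchor's, avoiding the class collisions that purely random,
    label-agnostic sampling can cause."""
    labels = np.asarray(labels)
    candidates = np.flatnonzero(labels != labels[anchor_idx])
    return rng.choice(candidates, size=k, replace=False)

rng = np.random.default_rng(0)
labels = [0, 0, 1, 1, 2]
negs = sample_negatives(labels, anchor_idx=0, k=2, rng=rng)
# every sampled index belongs to a class other than the anchor's (class 0)
```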
arXiv Detail & Related papers (2022-08-31T19:20:04Z)
- Too much information: CDCL solvers need to forget and perform restarts [0.0]
Conflict-driven clause learning (CDCL) is a remarkably successful paradigm for solving the satisfiability problem of propositional logic.
This paper demonstrates, perhaps surprisingly, that clause learning can not only improve runtime but can also, at times, dramatically deteriorate it.
arXiv Detail & Related papers (2022-02-01T10:16:04Z)
- Superposition with Lambdas [59.87497175616048]
We design a superposition calculus for a clausal fragment of extensional polymorphic higher-order logic that includes anonymous functions but excludes Booleans.
The inference rules work on $\beta\eta$-equivalence classes of $\lambda$-terms and rely on higher-order unification to achieve refutational completeness.
arXiv Detail & Related papers (2021-01-31T13:53:17Z)
- On the Effect of Learned Clauses on Stochastic Local Search [0.0]
Conflict-driven clause learning (CDCL) and stochastic local search (SLS) are both used in SAT solvers.
We experimentally demonstrate that clauses with a large number of correct literals are beneficial to the runtime of SLS.
We deduce the most beneficial strategies to add high-quality clauses as a preprocessing step.
arXiv Detail & Related papers (2020-05-07T13:33:16Z)
- Superposition for Lambda-Free Higher-Order Logic [62.997667081978825]
We introduce refutationally complete superposition calculi for intensional and extensional clausal $\lambda$-free higher-order logic.
The calculi are parameterized by a term order that need not be fully monotonic, making it possible to employ the $\lambda$-free higher-order lexicographic path and Knuth-Bendix orders.
arXiv Detail & Related papers (2020-05-05T12:10:21Z)
- Learning with Multiple Complementary Labels [94.8064553345801]
A complementary label (CL) simply indicates an incorrect class of an example, but learning with CLs results in multi-class classifiers.
We propose a novel problem setting that allows multiple complementary labels (MCLs) for each example, along with two ways of learning with MCLs.
arXiv Detail & Related papers (2019-12-30T13:50:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.