On the Effect of Learned Clauses on Stochastic Local Search
- URL: http://arxiv.org/abs/2005.04022v1
- Date: Thu, 7 May 2020 13:33:16 GMT
- Title: On the Effect of Learned Clauses on Stochastic Local Search
- Authors: Jan-Hendrik Lorenz and Florian Wörz
- Abstract summary: Conflict-driven clause learning (CDCL) and stochastic local search (SLS) are two competing paradigms used in SAT solvers.
We experimentally demonstrate that clauses with a large number of correct literals are beneficial to the runtime of SLS.
We deduce the most beneficial strategies to add high-quality clauses as a preprocessing step.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There are two competing paradigms in successful SAT solvers: Conflict-driven
clause learning (CDCL) and stochastic local search (SLS). CDCL uses systematic
exploration of the search space and has the ability to learn new clauses. SLS
examines the neighborhood of the current complete assignment. Unlike CDCL, it
lacks the ability to learn from its mistakes. This work revolves around the
question of whether it is beneficial for SLS to add new clauses to the original
formula. We experimentally demonstrate that clauses with a large number of
correct literals w.r.t. a fixed solution are beneficial to the runtime of
SLS. We call such clauses high-quality clauses.
Empirical evaluations show that short clauses learned by CDCL possess the
high-quality attribute. We study several domains of randomly generated
instances and deduce the most beneficial strategies to add high-quality clauses
as a preprocessing step. The strategies are implemented in an SLS solver, and
it is shown that this considerably improves the state-of-the-art on randomly
generated instances. The results are statistically significant.
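To make the notion of a high-quality clause concrete, the following minimal sketch (our illustration under assumed conventions, not the authors' implementation) counts how many literals of a clause are correct w.r.t. a fixed satisfying assignment. The threshold min_correct and all helper names are hypothetical; clauses are DIMACS-style tuples of signed integers.

```python
# Minimal sketch (assumed conventions, not the paper's code): a clause is a
# tuple of non-zero ints (positive = variable, negative = its negation);
# an assignment maps each variable index to True/False.

def correct_literals(clause, assignment):
    """Count the literals of `clause` that are satisfied by `assignment`."""
    return sum(1 for lit in clause if assignment[abs(lit)] == (lit > 0))

def is_high_quality(clause, assignment, min_correct=2):
    """Illustrative filter: a clause is 'high-quality' w.r.t. a fixed
    solution if at least `min_correct` of its literals are correct."""
    return correct_literals(clause, assignment) >= min_correct

# Example: (x1 OR NOT x2 OR x3) under the solution x1=T, x2=T, x3=F.
solution = {1: True, 2: True, 3: False}
print(correct_literals((1, -2, 3), solution))  # -> 1 (only x1 is correct)
print(is_high_quality((1, -2, 3), solution))   # -> False
```

Since the paper reports that short CDCL-learned clauses tend to possess this attribute, a preprocessing step in this spirit would run a CDCL solver briefly, keep its short learned clauses, and append them to the formula before SLS starts.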
Related papers
- EXPLORA: Efficient Exemplar Subset Selection for Complex Reasoning [5.172620636569522]
Large language models (LLMs) have enabled in-context learning (ICL), allowing LLMs to acquire proficiency in a specific task using only a few demonstration samples (exemplars).
A critical challenge in ICL is the selection of optimal exemplars, which can be either task-specific (static) or test-example-specific (dynamic).
arXiv Detail & Related papers (2024-11-06T12:48:04Z)
- Bayesian scaling laws for in-context learning [72.17734205418502]
In-context learning (ICL) is a powerful technique for getting language models to perform complex tasks with no training updates.
We show that ICL approximates a Bayesian learner and develop a family of novel Bayesian scaling laws for ICL.
arXiv Detail & Related papers (2024-10-21T21:45:22Z)
- ParaICL: Towards Robust Parallel In-Context Learning [74.38022919598443]
Large language models (LLMs) have become the norm in natural language processing.
Few-shot in-context learning (ICL) relies on the choice of few-shot demonstration examples.
We propose a novel method named parallel in-context learning (ParaICL).
arXiv Detail & Related papers (2024-03-31T05:56:15Z)
- Batch-ICL: Effective, Efficient, and Order-Agnostic In-Context Learning [27.729189318779603]
Batch-ICL is an effective, efficient, and order-agnostic inference algorithm for in-context learning.
We show that Batch-ICL consistently outperforms most permutations of ICL examples.
We also develop a novel variant of Batch-ICL featuring multiple "epochs" of meta-optimization.
arXiv Detail & Related papers (2024-01-12T09:31:17Z)
- SCL(FOL) Can Simulate Non-Redundant Superposition Clause Learning [0.3867363075280543]
We show that SCL(FOL) can simulate the derivation of non-redundant clauses by superposition for first-order logic without equality.
Superposition-based reasoning is performed with respect to a fixed reduction ordering.
arXiv Detail & Related papers (2023-05-22T11:12:39Z)
- Alleviating Over-smoothing for Unsupervised Sentence Representation [96.19497378628594]
We present a Simple method named Self-Contrastive Learning (SSCL) to alleviate this issue.
Our proposed method is quite simple and can be easily extended to various state-of-the-art models for performance boosting.
arXiv Detail & Related papers (2023-05-09T11:00:02Z)
- Towards an Understanding of Long-Tailed Runtimes of SLS Algorithms [0.0]
The GapSAT solver demonstrated a successful way to improve the average performance of an SLS solver by learning additional information.
We propose a method for generating logically equivalent problem formulations.
We prove that the runtimes of Schöning's random walk algorithm (see the sketch after this list) approximately follow Johnson SB distributions.
arXiv Detail & Related papers (2022-10-24T12:22:25Z)
- Estimating the hardness of SAT encodings for Logical Equivalence Checking of Boolean circuits [58.83758257568434]
We show that the hardness of SAT encodings for LEC instances can be estimated w.r.t. some SAT partitioning.
The paper proposes several methods for constructing partitionings, which, when used in practice, allow one to estimate the hardness of SAT encodings for LEC with good accuracy.
arXiv Detail & Related papers (2022-10-04T09:19:13Z)
- Too much information: CDCL solvers need to forget and perform restarts [0.0]
Conflict-driven clause learning (CDCL) is a remarkably successful paradigm for solving the satisfiability problem of propositional logic.
This paper demonstrates, quite surprisingly, that clause learning can not only improve the runtime but can also oftentimes deteriorate it dramatically.
arXiv Detail & Related papers (2022-02-01T10:16:04Z)
- Contrastive Learning with Adversarial Examples [79.39156814887133]
Contrastive learning (CL) is a popular technique for self-supervised learning (SSL) of visual representations.
This paper introduces a new family of adversarial examples for contrastive learning and uses these examples to define a new adversarial training algorithm for SSL, denoted as CLAE.
arXiv Detail & Related papers (2020-10-22T20:45:10Z)
- Joint Contrastive Learning with Infinite Possibilities [114.45811348666898]
This paper explores useful modifications of the recent development in contrastive learning via novel probabilistic modeling.
We derive a particular form of contrastive loss named Joint Contrastive Learning (JCL).
arXiv Detail & Related papers (2020-09-30T16:24:21Z)
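For reference, Schöning's random walk algorithm, mentioned in the long-tailed-runtimes entry above, admits a compact sketch (a textbook rendering with illustrative parameters, not the GapSAT implementation): restart from a uniformly random assignment and, for about 3n steps, flip a random variable from a randomly chosen unsatisfied clause.

```python
import random

def schoening(clauses, n_vars, max_tries=1000):
    """Minimal sketch of Schöning's random walk for 3-SAT (illustrative
    parameters, not the GapSAT code). Clauses are DIMACS-style tuples of
    signed ints over variables 1..n_vars."""
    for _ in range(max_tries):
        # Restart from a uniformly random complete assignment.
        assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(3 * n_vars):  # classic walk length for 3-SAT
            unsat = [c for c in clauses
                     if not any(assign[abs(l)] == (l > 0) for l in c)]
            if not unsat:
                return assign  # satisfying assignment found
            # Flip a uniformly random variable of a random unsatisfied clause.
            lit = random.choice(random.choice(unsat))
            assign[abs(lit)] = not assign[abs(lit)]
    return None  # no solution found within max_tries restarts

# Example: (x1 OR x2) AND (NOT x1 OR x2) is satisfied whenever x2 = True.
print(schoening([(1, 2), (-1, 2)], n_vars=2))
```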
This list is automatically generated from the titles and abstracts of the papers on this site.