Using deep learning to construct stochastic local search SAT solvers
with performance bounds
- URL: http://arxiv.org/abs/2309.11452v2
- Date: Sat, 23 Sep 2023 10:04:53 GMT
- Title: Using deep learning to construct stochastic local search SAT solvers
with performance bounds
- Authors: Maximilian Kramer, Paul Boes
- Abstract summary: We train oracles using Graph Neural Networks and evaluate them on two SLS solvers on random SAT instances of varying difficulty.
We find that access to GNN-based oracles significantly boosts the performance of both solvers.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Boolean Satisfiability problem (SAT) is the most prototypical NP-complete
problem and of great practical relevance. One important class of solvers for
this problem comprises stochastic local search (SLS) algorithms that iteratively and
randomly update a candidate assignment. Recent breakthrough results in
theoretical computer science have established sufficient conditions under which
SLS solvers are guaranteed to efficiently solve a SAT instance, provided they
have access to suitable "oracles" that provide samples from an
instance-specific distribution, exploiting an instance's local structure.
Motivated by these results and the well-established ability of neural networks
to learn common structure in large datasets, in this work, we train oracles
using Graph Neural Networks and evaluate them on two SLS solvers on random SAT
instances of varying difficulty. We find that access to GNN-based oracles
significantly boosts the performance of both solvers, allowing them, on
average, to solve 17% more difficult instances (as measured by the ratio
between clauses and variables), and to do so in 35% fewer steps, with
improvements in the median number of steps of up to a factor of 8. As such,
this work bridges formal results from theoretical computer science and
practically motivated research on deep learning for constraint satisfaction
problems and establishes the promise of purpose-trained SAT solvers with
performance guarantees.
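To make the oracle-guided SLS scheme described above concrete, the following is a minimal sketch of a WalkSAT-style loop that, at each step, picks a violated clause and flips one of its variables sampled from an oracle-provided distribution. The function names (`sls_solve_with_oracle`, `uniform_oracle`) and the oracle interface are illustrative assumptions, not the paper's implementation; in the paper the distribution would come from a trained Graph Neural Network.

```python
import random

# A CNF SAT instance: each clause is a list of non-zero integer literals,
# e.g. [1, -2, 3] encodes (x1 OR NOT x2 OR x3); variables are numbered 1..n_vars.
def clause_satisfied(clause, assignment):
    return any((lit > 0) == assignment[abs(lit)] for lit in clause)

def sls_solve_with_oracle(clauses, n_vars, oracle, max_steps=10_000, seed=0):
    """WalkSAT-style local search guided by an oracle.

    `oracle(clauses, assignment, clause)` is assumed to return a probability
    for each variable of the violated `clause` (e.g. marginals predicted by a
    GNN); this interface is an illustrative assumption, not the paper's API.
    """
    rng = random.Random(seed)
    assignment = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    for step in range(max_steps):
        unsat = [c for c in clauses if not clause_satisfied(c, assignment)]
        if not unsat:
            return assignment, step              # satisfying assignment found
        clause = rng.choice(unsat)               # pick a violated clause at random
        variables = [abs(lit) for lit in clause]
        probs = oracle(clauses, assignment, clause)   # instance-specific distribution
        var = rng.choices(variables, weights=[probs[v] for v in variables])[0]
        assignment[var] = not assignment[var]    # flip the sampled variable
    return None, max_steps                       # give up after max_steps

# Trivial baseline oracle: uniform over the clause's variables. A trained GNN
# oracle would instead return learned, instance-specific flip probabilities.
def uniform_oracle(clauses, assignment, clause):
    variables = [abs(lit) for lit in clause]
    return {v: 1.0 / len(variables) for v in variables}
```

With `uniform_oracle` the loop degenerates into a plain random walk; the paper's point is that replacing it with a GNN trained on the instance distribution lets the solver handle harder clause-to-variable ratios in fewer steps.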
Related papers
- torchmSAT: A GPU-Accelerated Approximation To The Maximum Satisfiability Problem [1.5850859526672516]
We derive a single differentiable function capable of approximating solutions for the Maximum Satisfiability Problem (MaxSAT).
We present a novel neural network architecture to model our differentiable function, and progressively solve MaxSAT using backpropagation (a generic sketch of such a differentiable relaxation follows the related-papers list below).
arXiv Detail & Related papers (2024-02-06T02:33:00Z)
- G4SATBench: Benchmarking and Advancing SAT Solving with Graph Neural Networks [7.951021955925275]
Graph neural networks (GNNs) have emerged as a promising approach for solving the Boolean Satisfiability Problem (SAT).
G4SATBench is the first benchmark study that establishes a comprehensive evaluation framework for GNN-based SAT solvers.
Our results provide valuable insights into the performance of GNN-based SAT solvers.
arXiv Detail & Related papers (2023-09-29T02:50:57Z)
- W2SAT: Learning to generate SAT instances from Weighted Literal Incidence Graphs [11.139131079925113]
W2SAT is a framework to generate SAT formulas by learning intrinsic structures and properties from given real-world/industrial instances.
We introduce a novel SAT representation called Weighted Literal Incidence Graph (WLIG), which exhibits strong representation ability and generalizability.
Decoding from WLIG into SAT problems is then modeled as finding overlapping cliques with a novel hill-climbing optimization method.
arXiv Detail & Related papers (2023-02-01T06:30:41Z)
- Estimating the hardness of SAT encodings for Logical Equivalence Checking of Boolean circuits [58.83758257568434]
We show that the hardness of SAT encodings for LEC instances can be estimated w.r.t. some SAT partitioning.
The paper proposes several methods for constructing partitionings, which, when used in practice, allow one to estimate the hardness of SAT encodings for LEC with good accuracy.
arXiv Detail & Related papers (2022-10-04T09:19:13Z)
- DeepSAT: An EDA-Driven Learning Framework for SAT [9.111341161918375]
We present DeepSAT, a novel end-to-end learning framework for the Boolean satisfiability (SAT) problem.
DeepSAT achieves significant accuracy improvements over state-of-the-art learning-based SAT solutions.
arXiv Detail & Related papers (2022-05-27T03:20:42Z)
- Machine Learning Methods in Solving the Boolean Satisfiability Problem [72.21206588430645]
The paper reviews the recent literature on solving the Boolean satisfiability problem (SAT) with machine learning techniques.
We examine the evolving ML-SAT solvers from naive classifiers with handcrafted features to the emerging end-to-end SAT solvers such as NeuroSAT.
arXiv Detail & Related papers (2022-03-02T05:14:12Z)
- Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning [65.54757265434465]
Pairwise learning refers to learning tasks where the loss function depends on a pair of instances.
Online gradient descent (OGD) is a popular approach to handling streaming data in pairwise learning.
In this paper, we propose simple stochastic and online gradient descent methods for pairwise learning.
arXiv Detail & Related papers (2021-11-23T18:10:48Z)
- Generalization of Neural Combinatorial Solvers Through the Lens of Adversarial Robustness [68.97830259849086]
Most datasets only capture a simpler subproblem and likely suffer from spurious features.
We study adversarial robustness - a local generalization property - to reveal hard, model-specific instances and spurious features.
Unlike in other applications, where perturbation models are designed around subjective notions of imperceptibility, our perturbation models are efficient and sound.
Surprisingly, with such perturbations, a sufficiently expressive neural solver does not suffer from the limitations of the accuracy-robustness trade-off common in supervised learning.
arXiv Detail & Related papers (2021-10-21T07:28:11Z)
- Transformer-based Machine Learning for Fast SAT Solvers and Logic Synthesis [63.53283025435107]
CNF-based SAT and MaxSAT solvers are central to logic synthesis and verification systems.
In this work, we propose a one-shot model derived from the Transformer architecture to solve the MaxSAT problem.
arXiv Detail & Related papers (2021-07-15T04:47:35Z)
- Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)
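As referenced in the torchmSAT entry above, a differentiable relaxation lets MaxSAT be attacked with backpropagation. The sketch below is a generic such relaxation under assumed conventions (clauses as lists of signed integers, PyTorch for automatic differentiation); it is not torchmSAT's actual architecture, which the abstract summary does not specify.

```python
import torch

def soft_maxsat(clauses, n_vars, steps=500, lr=0.1, seed=0):
    """Gradient-based MaxSAT relaxation (a generic sketch, not torchmSAT itself).

    Each Boolean variable x_i is relaxed to a probability p_i = sigmoid(theta_i);
    a clause's probability of being satisfied under independent sampling is
    1 - prod(1 - p(literal)), and we maximise the expected number of satisfied
    clauses by backpropagation, then round back to a Boolean assignment.
    """
    torch.manual_seed(seed)
    theta = torch.nn.Parameter(torch.randn(n_vars))
    opt = torch.optim.Adam([theta], lr=lr)
    for _ in range(steps):
        p = torch.sigmoid(theta)                      # P(x_i = True)
        loss = 0.0
        for clause in clauses:
            lit_probs = torch.stack(
                [p[abs(l) - 1] if l > 0 else 1 - p[abs(l) - 1] for l in clause]
            )
            p_sat = 1 - torch.prod(1 - lit_probs)     # P(clause satisfied)
            loss = loss - p_sat                       # maximise => minimise negative
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (torch.sigmoid(theta) > 0.5).tolist()      # rounded Boolean assignment
```

Minimising the negative expected number of satisfied clauses is only one possible surrogate objective; the rounded assignment gives an approximate MaxSAT solution rather than a certified optimum.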
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.