An efficient constraint based framework for handling floating point SMT problems
- URL: http://arxiv.org/abs/2002.12441v1
- Date: Thu, 27 Feb 2020 21:11:22 GMT
- Title: An efficient constraint based framework for handling floating point SMT problems
- Authors: Heytem Zitoun, Claude Michel, Laurent Michel, Michel Rueher
- Abstract summary: This paper introduces the 2019 version of \us{}, a novel Constraint Programming framework for floating point verification problems.
In \us{}, constraints over the floats are first-class objects, and the purpose is to expose and exploit structures of floating point domains.
- Score: 0.5161531917413706
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces the 2019 version of \us{}, a novel Constraint
Programming framework for floating point verification problems expressed with
the SMT language of SMTLIB. SMT solvers decompose their task by delegating to
specific theories (e.g., floating point, bit vectors, arrays, ...) the task to
reason about combinatorial or otherwise complex constraints for which the SAT
encoding would be cumbersome or ineffective. These decomposition and encoding
processes lead to the obfuscation of the high-level constraints and a loss of
information on the structure of the combinatorial model. In \us{}, constraints
over the floats are first-class objects, and the purpose is to expose and
exploit structures of floating point domains to enhance the search process. A
symbolic phase rewrites each SMTLIB instance to elementary constraints, and
eliminates auxiliary variables whose presence is counterproductive. A
diversification technique within the search steers it away from costly
enumerations in unproductive areas of the search space. The empirical
evaluation demonstrates that the 2019 version of \us{} is competitive on
computationally challenging floating point benchmarks that induce significant
search efforts even for other CP solvers. It highlights that the ability to
harness both inference and search is critical. Indeed, it yields a factor 3
improvement over Colibri and is up to 10 times faster than SMT solvers. The
evaluation was conducted over 214 benchmarks (the Griggio suite), which is a
standard within SMTLIB.
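To make the symbolic phase more concrete, the sketch below illustrates, under loose assumptions, how a nested floating point expression can be rewritten into elementary ternary constraints of the form t = a <op> b, and how naming the root of an asserted equality directly avoids one auxiliary variable. This is an illustrative toy, not \us{}'s actual code; the BinOp class, the decompose helper, and the printed tuples are all hypothetical.

```python
# Illustrative sketch only (not the paper's implementation): flatten a nested
# floating point expression into elementary ternary constraints of the form
# (target, op, a, b), creating fresh auxiliary variables as needed.
from dataclasses import dataclass
from itertools import count

@dataclass(frozen=True)
class BinOp:
    op: str        # e.g. "fp.add", "fp.mul" in SMT-LIB's floating point theory
    left: object   # BinOp or a variable name (str)
    right: object

_fresh = count(1)

def decompose(expr, out, name=None):
    """Name `expr` (with `name` if given, else a fresh temporary) and append
    ternary constraints (target, op, a, b) to `out`."""
    if isinstance(expr, str):
        return expr                      # already an atomic variable
    a = decompose(expr.left, out)
    b = decompose(expr.right, out)
    t = name or f"_t{next(_fresh)}"
    out.append((t, expr.op, a, b))
    return t

# Example: the assertion (= w (fp.add rne x (fp.mul rne y z))).
# Naming the root "w" directly eliminates the auxiliary variable that a
# separate equality constraint would otherwise introduce.
expr = BinOp("fp.add", "x", BinOp("fp.mul", "y", "z"))
elementary = []
decompose(expr, elementary, name="w")
for c in elementary:
    print(c)
# ('_t1', 'fp.mul', 'y', 'z')
# ('w', 'fp.add', 'x', '_t1')
```

In a float-aware CP setting such as the one described in the abstract, each ternary constraint would then be handled by a dedicated filtering operator over floating point domains; the sketch deliberately omits rounding modes and domain filtering.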
Related papers
- Syzygy of Thoughts: Improving LLM CoT with the Minimal Free Resolution [59.39066657300045]
Chain-of-Thought (CoT) prompting enhances the reasoning of large language models (LLMs) by decomposing problems into sequential steps.
We propose Syzygy of Thoughts (SoT)-a novel framework that extends CoT by introducing auxiliary, interrelated reasoning paths.
SoT captures deeper logical dependencies, enabling more robust and structured problem-solving.
arXiv Detail & Related papers (2025-04-13T13:35:41Z)
- Fast Controlled Generation from Language Models with Adaptive Weighted Rejection Sampling [90.86991492288487]
Evaluating the constraint on every token can be prohibitively expensive.
Locally constrained decoding (LCD) can distort the global distribution over strings, sampling tokens based only on local information.
We show that our approach is superior to state-of-the-art baselines.
arXiv Detail & Related papers (2025-04-07T18:30:18Z)
- To CoT or not to CoT? Chain-of-thought helps mainly on math and symbolic reasoning [55.52872152909785]
Chain-of-thought (CoT) prompting is the de facto method for eliciting reasoning capabilities from large language models (LLMs).
We show that CoT gives strong performance benefits primarily on tasks involving math or logic, with much smaller gains on other types of tasks.
arXiv Detail & Related papers (2024-09-18T17:55:00Z)
- Efficient Inverted Indexes for Approximate Retrieval over Learned Sparse Representations [8.796275989527054]
We propose a novel organization of the inverted index that enables fast retrieval over learned sparse embeddings.
Our approach organizes inverted lists into geometrically-cohesive blocks, each equipped with a summary vector (a generic sketch of this blocking idea appears after the related-papers list).
Our results indicate that Seismic is one to two orders of magnitude faster than state-of-the-art inverted index-based solutions.
arXiv Detail & Related papers (2024-04-29T15:49:27Z)
- Rethinking Few-shot 3D Point Cloud Semantic Segmentation [62.80639841429669]
This paper revisits few-shot 3D point cloud semantic segmentation (FS-PCS).
We focus on two significant issues in the state-of-the-art: foreground leakage and sparse point distribution.
To address these issues, we introduce a standardized FS-PCS setting, upon which a new benchmark is built.
arXiv Detail & Related papers (2024-03-01T15:14:47Z)
- Revisiting Evaluation Metrics for Semantic Segmentation: Optimization and Evaluation of Fine-grained Intersection over Union [113.20223082664681]
We propose the use of fine-grained mIoUs along with corresponding worst-case metrics.
These fine-grained metrics offer less bias towards large objects, richer statistical information, and valuable insights into model and dataset auditing.
Our benchmark study highlights the necessity of not basing evaluations on a single metric and confirms that fine-grained mIoUs reduce the bias towards large objects.
arXiv Detail & Related papers (2023-10-30T03:45:15Z)
- Contextual Stochastic Bilevel Optimization [50.36775806399861]
We introduce contextual stochastic bilevel optimization (CSBO) -- a bilevel optimization framework with the lower-level problem minimizing an expectation conditioned on some contextual information and the upper-level variable.
It is important for applications such as meta-learning, personalized learning, end-to-end learning, and Wasserstein distributionally robust optimization with side information (WDRO-SI).
arXiv Detail & Related papers (2023-10-27T23:24:37Z)
- A Block Coordinate Descent Method for Nonsmooth Composite Optimization under Orthogonality Constraints [7.9047096855236125]
Nonsmooth composite optimization under orthogonality constraints is crucial in statistical learning and data science.
OBCD is a feasible method with a small computational footprint that addresses these challenges.
We prove that OBCD converges to block-$k$ stationary points, which offer stronger optimality than standard critical points.
arXiv Detail & Related papers (2023-04-07T13:44:59Z)
- Global and Preference-based Optimization with Mixed Variables using Piecewise Affine Surrogates [0.6083861980670925]
This paper proposes a novel surrogate-based global optimization algorithm to solve linearly constrained mixed-variable problems.
We assume the objective function is black-box and expensive to evaluate, while the linear constraints are quantifiable, unrelaxable, and known a priori.
We introduce two types of exploration functions to efficiently search the feasible domain via mixed-integer linear programming solvers.
arXiv Detail & Related papers (2023-02-09T15:04:35Z)
- Multi-block-Single-probe Variance Reduced Estimator for Coupled Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on strong convexity.
arXiv Detail & Related papers (2022-07-18T12:03:26Z)
- Local Stochastic Bilevel Optimization with Momentum-Based Variance Reduction [104.41634756395545]
We study Federated Bilevel Optimization problems. Specifically, we first propose FedBiO, a deterministic gradient-based algorithm.
We show that FedBiO has a complexity of $O(\epsilon^{-1.5})$.
Our algorithms show superior performances compared to other baselines in numerical experiments.
arXiv Detail & Related papers (2022-05-03T16:40:22Z)
- Minimization of Stochastic First-order Oracle Complexity of Adaptive Methods for Nonconvex Optimization [0.0]
We prove that there exist critical batch sizes in the sense of minimizing the lower and upper bounds of the stochastic first-order oracle (SFO) complexity.
We also discuss the conditions needed for the SFO complexity to fit the lower and upper bounds and provide numerical results that support our theoretical results.
arXiv Detail & Related papers (2021-12-14T04:55:04Z)
- Recursive Causal Structure Learning in the Presence of Latent Variables and Selection Bias [27.06618125828978]
We consider the problem of learning the causal maximal ancestral graph (MAG) of a system from observational data in the presence of latent variables and selection bias.
We propose a novel computationally efficient constraint-based method that is sound and complete.
We provide experimental results to compare the proposed approach with the state of the art on both synthetic and real-world structures.
arXiv Detail & Related papers (2021-10-22T19:49:59Z)
- Decomposed Quadratization: Efficient QUBO Formulation for Learning Bayesian Network [0.0]
It is essential to minimize the number of binary variables used in the objective function.
We propose a QUBO formulation that offers a bit capacity advantage over conventional quadratization techniques.
arXiv Detail & Related papers (2020-06-12T03:19:48Z)
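As forward-referenced in the "Efficient Inverted Indexes" entry above, the blocking-plus-summary idea mentioned there can be sketched generically. The code below is a hedged illustration only, not Seismic's algorithm or API: postings are grouped into fixed-size blocks, each block stores the coordinate-wise maximum of its vectors as a summary, and query processing skips any block whose summary-based upper bound on the dot product falls below a threshold. All names (build_blocks, search, the toy postings) are hypothetical.

```python
# Generic illustration (not Seismic's implementation): block a posting list of
# (doc_id, sparse_vector) pairs and prune whole blocks via summary vectors.
from collections import defaultdict

def build_blocks(postings, block_size=2):
    """Group postings into fixed-size blocks; each block keeps a summary
    vector holding the coordinate-wise maximum of its members."""
    blocks = []
    for i in range(0, len(postings), block_size):
        chunk = postings[i:i + block_size]
        summary = defaultdict(float)
        for _, vec in chunk:
            for term, weight in vec.items():
                summary[term] = max(summary[term], weight)
        blocks.append((dict(summary), chunk))
    return blocks

def search(blocks, query, threshold):
    """Score documents against a sparse query, skipping blocks whose summary
    bounds the best possible dot product below `threshold`."""
    hits = []
    for summary, chunk in blocks:
        bound = sum(w * summary.get(t, 0.0) for t, w in query.items())
        if bound < threshold:
            continue                     # the whole block is pruned
        for doc_id, vec in chunk:
            score = sum(w * vec.get(t, 0.0) for t, w in query.items())
            if score >= threshold:
                hits.append((doc_id, score))
    return sorted(hits, key=lambda h: -h[1])

postings = [(1, {"gpu": 2.0}), (2, {"smt": 1.5, "float": 0.5}),
            (3, {"float": 2.5}), (4, {"cp": 1.0, "float": 1.0})]
blocks = build_blocks(postings)
print(search(blocks, {"float": 1.0, "smt": 1.0}, threshold=1.5))
# [(3, 2.5), (2, 2.0)]
```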
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides (including all of the above) and is not responsible for any consequences of its use.