Logical Guidance for the Exact Composition of Diffusion Models
- URL: http://arxiv.org/abs/2602.05549v1
- Date: Thu, 05 Feb 2026 11:10:06 GMT
- Title: Logical Guidance for the Exact Composition of Diffusion Models
- Authors: Francesco Alesiani, Jonathan Warrell, Tanja Bien, Henrik Christiansen, Matheus Ferraz, Mathias Niepert
- Abstract summary: LOGDIFF is a framework for principled constrained generation with complex logical expressions at inference time. We show that exact score-based guidance for complex logical formulas can be obtained from guidance signals associated with atomic properties.
- Score: 20.022782701918572
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose LOGDIFF (Logical Guidance for the Exact Composition of Diffusion Models), a guidance framework for diffusion models that enables principled constrained generation with complex logical expressions at inference time. We study when exact score-based guidance for complex logical formulas can be obtained from guidance signals associated with atomic properties. First, we derive an exact Boolean calculus that provides a sufficient condition for exact logical guidance. Specifically, if a formula admits a circuit representation in which conjunctions combine conditionally independent subformulas and disjunctions combine subformulas that are either conditionally independent or mutually exclusive, exact logical guidance is achievable. In this case, the guidance signal can be computed exactly from atomic scores and posterior probabilities using an efficient recursive algorithm. Moreover, we show that, for commonly encountered classes of distributions, any desired Boolean formula is compilable into such a circuit representation. Second, by combining atomic guidance scores with posterior probability estimates, we introduce a hybrid guidance approach that bridges classifier guidance and classifier-free guidance, applicable to both compositional logical guidance and standard conditional generation. We demonstrate the effectiveness of our framework on multiple image and protein structure generation tasks.
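The recursive composition described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the class names (`Atom`, `And`, `Or`) and the `guidance` function are illustrative, and the sketch assumes each atomic property supplies a guidance score (the gradient of its log-posterior) and a posterior estimate, that conjunctions combine conditionally independent subformulas, and that disjunctions combine mutually exclusive ones:

```python
import numpy as np

class Atom:
    """An atomic property with its guidance score and posterior estimate."""
    def __init__(self, score, posterior):
        self.score = np.asarray(score)  # ∇_x log p(a | x_t)
        self.posterior = posterior      # scalar estimate of p(a | x_t)

class And:
    def __init__(self, *children): self.children = children

class Or:
    def __init__(self, *children): self.children = children

def guidance(node):
    """Recursively return (score, posterior) for a formula node."""
    if isinstance(node, Atom):
        return node.score, node.posterior
    if isinstance(node, And):
        # Conditional independence: log p(A ∧ B | x) = log p(A|x) + log p(B|x),
        # so scores add and posteriors multiply.
        parts = [guidance(c) for c in node.children]
        score = sum(s for s, _ in parts)
        post = float(np.prod([p for _, p in parts]))
        return score, post
    if isinstance(node, Or):
        # Mutual exclusivity: p(A ∨ B | x) = p(A|x) + p(B|x), hence
        # ∇ log(p_A + p_B) is the posterior-weighted average of the scores.
        parts = [guidance(c) for c in node.children]
        post = sum(p for _, p in parts)
        score = sum(p * s for s, p in parts) / post
        return score, post
    raise TypeError(f"unknown node type: {node!r}")
```

The two cases mirror the sufficient condition in the abstract: under conditional independence the log-posteriors (and thus their gradients) decompose additively, while under mutual exclusivity the posterior of a disjunction is a sum, so its score is a posterior-weighted mixture of the atomic scores.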
Related papers
- LSRIF: Logic-Structured Reinforcement Learning for Instruction Following [56.517329105764475]
We propose a logic-structured training framework LSRIF that explicitly models instruction logic. Experiments show LSRIF brings significant improvements in instruction-following and general reasoning.
arXiv Detail & Related papers (2026-01-10T05:11:38Z) - Modelling Arbitrary Computations in the Symbolic Model using an Equational Theory for Bounded Binary Circuits [0.0]
We propose a class of equational theories for bounded binary circuits with the finite variant property. These theories could serve as a building block to specify cryptographic primitive implementations.
arXiv Detail & Related papers (2025-07-29T12:09:50Z) - Loss-Complexity Landscape and Model Structure Functions [53.92822954974537]
We develop a framework for dualizing the Kolmogorov structure function $h_x(\alpha)$. We establish a mathematical analogy between information-theoretic constructs and statistical mechanics. We explicitly prove the Legendre-Fenchel duality between the structure function and free energy.
arXiv Detail & Related papers (2025-07-17T21:31:45Z) - On the Diagram of Thought [20.805936414171892]
Large Language Models (LLMs) excel at many tasks but often falter on complex problems that require structured, multi-step reasoning. We introduce the Diagram of Thought (DoT), a new framework that enables a single LLM to build and navigate a mental map of its reasoning.
arXiv Detail & Related papers (2024-09-16T07:01:41Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Tractable Bounding of Counterfactual Queries by Knowledge Compilation [51.47174989680976]
We discuss the problem of bounding partially identifiable queries, such as counterfactuals, in Pearlian structural causal models.
A recently proposed iterated EM scheme yields an inner approximation of those bounds by sampling the initialisation parameters.
We show how a single symbolic knowledge compilation allows us to obtain the circuit structure with symbolic parameters to be replaced by their actual values.
arXiv Detail & Related papers (2023-10-05T07:10:40Z) - DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained
Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z) - Differentiable Inference of Temporal Logic Formulas [1.370633147306388]
We demonstrate the first Recurrent Neural Network architecture for learning Signal Temporal Logic formulas.
We present the first systematic comparison of formula inference methods.
arXiv Detail & Related papers (2022-08-10T16:52:23Z) - Adaptive n-ary Activation Functions for Probabilistic Boolean Logic [2.294014185517203]
We show that we can learn arbitrary logic in a single layer using an activation function of matching or greater arity.
We represent belief tables using a basis that directly associates the number of nonzero parameters to the effective arity of the belief function.
This opens optimization approaches to reduce logical complexity by inducing parameter sparsity.
arXiv Detail & Related papers (2022-03-16T22:47:53Z) - Refining Labelled Systems for Modal and Constructive Logics with Applications [0.0]
This thesis serves as a means of transforming the semantics of a modal and/or constructive logic into an 'economical' proof system.
The refinement method connects two proof-theoretic paradigms: labelled and nested sequent calculi.
The introduced refined labelled calculi will be used to provide the first proof-search algorithms for deontic STIT logics.
arXiv Detail & Related papers (2021-07-30T08:27:15Z) - Structural Learning of Probabilistic Sentential Decision Diagrams under Partial Closed-World Assumption [127.439030701253]
Probabilistic sentential decision diagrams are a class of structured-decomposable circuits.
We propose a new scheme based on a partial closed-world assumption: data implicitly provide the logical base of the circuit.
Preliminary experiments show that the proposed approach might properly fit training data, and generalize well to test data, provided that these remain consistent with the underlying logical base.
arXiv Detail & Related papers (2021-07-26T12:01:56Z) - A Study of Continuous Vector Representations for Theorem Proving [2.0518509649405106]
We develop an encoding that allows for logical properties to be preserved and is additionally reversible.
This means that the tree shape of a formula including all symbols can be reconstructed from the dense vector representation.
We propose datasets that can be used to train these syntactic and semantic properties.
arXiv Detail & Related papers (2021-01-22T15:04:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.