Weighted Context-Free-Language Ordered Binary Decision Diagrams
- URL: http://arxiv.org/abs/2305.13610v2
- Date: Wed, 28 Aug 2024 19:06:27 GMT
- Title: Weighted Context-Free-Language Ordered Binary Decision Diagrams
- Authors: Meghana Sistla, Swarat Chaudhuri, Thomas Reps
- Abstract summary: This paper presents a new data structure, called Weighted Context-Free-Language Ordered BDDs (WCFLOBDDs).
For some functions, WCFLOBDDs are exponentially more succinct than WBDDs.
We apply WCFLOBDDs in quantum-circuit simulation, and find that they perform better than WBDDs on certain benchmarks.
- Score: 9.483290784772011
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a new data structure, called \emph{Weighted Context-Free-Language Ordered BDDs} (WCFLOBDDs), which are a hierarchically structured decision diagram, akin to Weighted BDDs (WBDDs) enhanced with a procedure-call mechanism. For some functions, WCFLOBDDs are exponentially more succinct than WBDDs. They are potentially beneficial for representing functions of type $\mathbb{B}^n \rightarrow D$, when a function's image $V \subseteq D$ has many different values. We apply WCFLOBDDs in quantum-circuit simulation, and find that they perform better than WBDDs on certain benchmarks. With a 15-minute timeout, the number of qubits that can be handled by WCFLOBDDs is 1-64$\times$ that of WBDDs (and 1-128$\times$ that of CFLOBDDs, which are an unweighted version of WCFLOBDDs). These results support the conclusion that for this application -- from the standpoint of problem size, measured as the number of qubits -- WCFLOBDDs provide the best of both worlds: performance roughly matches whichever of WBDDs and CFLOBDDs is better. (From the standpoint of running time, the results are more nuanced.)
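The abstract describes weighted decision diagrams in which edge weights allow a compact diagram to represent functions of type $\mathbb{B}^n \rightarrow D$ with many distinct values. As a rough intuition only (this is a generic weighted-BDD sketch, not the paper's hierarchical WCFLOBDD structure with its procedure-call mechanism; all names here are illustrative), evaluation multiplies the edge weights along the path selected by an input assignment:

```python
# Minimal weighted decision-diagram sketch: each edge carries a
# multiplicative weight, and evaluating an input multiplies the
# weights along the root-to-leaf path. Illustrative only.

class WNode:
    def __init__(self, var=None, low=None, high=None,
                 low_w=1.0, high_w=1.0, value=None):
        self.var = var        # decision-variable index (None for a leaf)
        self.low = low        # child followed when the variable is 0
        self.high = high      # child followed when the variable is 1
        self.low_w = low_w    # weight on the 0-edge
        self.high_w = high_w  # weight on the 1-edge
        self.value = value    # terminal value (leaves only)

def evaluate(node, assignment, acc=1.0):
    """Multiply edge weights along the path chosen by `assignment`."""
    if node.var is None:
        return acc * node.value
    if assignment[node.var]:
        return evaluate(node.high, assignment, acc * node.high_w)
    return evaluate(node.low, assignment, acc * node.low_w)

# f(x0, x1) = 2 if x0 else 3*x1 -- one shared leaf serves several
# output values because the weights, not the leaves, carry them.
leaf = WNode(value=1.0)
x1_node = WNode(var=1, low=WNode(value=0.0), high=leaf, high_w=3.0)
root = WNode(var=0, low=x1_node, high=leaf, high_w=2.0)
```

Sharing one terminal across many weighted paths is what lets weighted diagrams stay small when the function's image $V \subseteq D$ contains many values; WCFLOBDDs add a further level of hierarchical sharing on top of this.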
Related papers
- WDD: Weighted Delta Debugging [6.393194328016689]
Weighted Delta Debugging (WDD) is a novel concept that helps prior delta debugging algorithms overcome their limitations.
WDD assigns each element in the list a weight according to its size, and distinguishes elements by their weights during partitioning.
We extensively evaluated WDD in two representative applications, HDD and Perses, on 62 benchmarks across two languages.
arXiv Detail & Related papers (2024-11-28T23:31:33Z) - Exploiting Pre-trained Models for Drug Target Affinity Prediction with Nearest Neighbors [58.661454334877256]
Drug-Target binding Affinity (DTA) prediction is essential for drug discovery.
Despite the application of deep learning methods to DTA prediction, the achieved accuracy remains suboptimal.
We propose $k$NN-DTA, a non-representation embedding-based retrieval method built on a pre-trained DTA prediction model.
arXiv Detail & Related papers (2024-07-21T15:49:05Z) - Buffer of Thoughts: Thought-Augmented Reasoning with Large Language Models [65.48185395952788]
Buffer of Thoughts (BoT) is a novel and versatile thought-augmented reasoning approach.
We propose meta-buffer to store a series of informative high-level thoughts.
For each problem, we retrieve a relevant thought-template and adaptively instantiate it with specific reasoning structures.
arXiv Detail & Related papers (2024-06-06T17:22:08Z) - Variants of Tagged Sentential Decision Diagrams [6.891570850234007]
A recently proposed canonical form of Boolean functions, namely tagged sentential decision diagrams (TSDDs), exploits both the standard and zero-suppressed trimming rules.
In this paper, we present a variant of TSDDs which we call standard TSDDs (STSDDs) by reversing the order of trimming rules.
We then prove the canonicity of STSDDs and present the algorithms for binary operations on TSDDs.
arXiv Detail & Related papers (2023-11-16T12:29:25Z) - Exploiting Asymmetry in Logic Puzzles: Using ZDDs for Symbolic Model Checking Dynamic Epistemic Logic [0.0]
We use ZDDs to symbolically encode Kripke models used in Dynamic Epistemic Logic.
We show that replacing BDDs with the right variant of ZDDs can significantly reduce memory usage.
This suggests that ZDDs are a useful tool for model checking multi-agent systems.
arXiv Detail & Related papers (2023-07-11T07:13:09Z) - Direct Diffusion Bridge using Data Consistency for Inverse Problems [65.04689839117692]
Diffusion model-based inverse problem solvers have shown impressive performance, but are limited in speed.
Several recent works have tried to alleviate this problem by building a diffusion process, directly bridging the clean and the corrupted.
We propose a modified inference procedure that imposes data consistency without the need for fine-tuning.
arXiv Detail & Related papers (2023-05-31T12:51:10Z) - Belief Revision in Sentential Decision Diagrams [126.94029917018733]
We develop a general revision algorithm for SDDs based on a syntactic characterisation of Dalal revision.
Preliminary experiments on randomly generated knowledge bases show the advantages of performing revision directly within the SDD formalism.
arXiv Detail & Related papers (2022-01-20T11:01:41Z) - ADAPT: Mitigating Idling Errors in Qubits via Adaptive Dynamical Decoupling [3.2505361717998227]
A qubit is susceptible to idling errors, which occur when the qubit is idle and not actively undergoing any operations.
Existing Dynamical Decoupling protocols have been primarily studied for individual qubits.
We propose Adaptive Dynamical Decoupling (ADAPT), a software framework that estimates the efficacy of DD for each qubit combination.
arXiv Detail & Related papers (2021-09-11T16:15:24Z) - Double Forward Propagation for Memorized Batch Normalization [68.34268180871416]
Batch Normalization (BN) has been a standard component in designing deep neural networks (DNNs).
We propose a memorized batch normalization (MBN) which considers multiple recent batches to obtain more accurate and robust statistics.
Compared to related methods, the proposed MBN exhibits consistent behaviors in both training and inference.
arXiv Detail & Related papers (2020-10-10T08:48:41Z) - DynaBERT: Dynamic BERT with Adaptive Width and Depth [55.18269622415814]
We propose a novel dynamic BERT model (abbreviated as DynaBERT).
It can flexibly adjust the size and latency by selecting adaptive width and depth.
It consistently outperforms existing BERT compression methods.
arXiv Detail & Related papers (2020-04-08T15:06:28Z) - Variable Shift SDD: A More Succinct Sentential Decision Diagram [10.91026094237478]
Sentential Decision Diagram (SDD) is a tractable representation of Boolean functions.
We propose a more succinct variant of SDD named Variable Shift SDD (VS-SDD).
We show that VS-SDDs are never larger than SDDs and there are cases in which the size of a VS-SDD is exponentially smaller than that of an SDD.
arXiv Detail & Related papers (2020-04-06T09:18:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.