SAT Strikes Back: Parameter and Path Relations in Quantum Toolchains
- URL: http://arxiv.org/abs/2505.22060v1
- Date: Wed, 28 May 2025 07:32:37 GMT
- Title: SAT Strikes Back: Parameter and Path Relations in Quantum Toolchains
- Authors: Lukas Schmidbauer, Wolfgang Mauerer
- Abstract summary: It is crucial to find (multiple) transformation paths that are optimised for (hardware-specific) metrics. We zoom into this pictured tree of transformations by focussing on k-SAT instances as input and their transformation to QUBO. Our results can be used to rate valid paths of transformation in advance -- also in automated (quantum) toolchains.
- Score: 3.0189109720302207
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In the foreseeable future, toolchains for quantum computing should offer automatic means of transforming a high-level problem formulation down to a hardware-executable form. In doing so, it is crucial to find (multiple) transformation paths that are optimised for (hardware-specific) metrics. We zoom into this pictured tree of transformations by focussing on k-SAT instances as input and their transformation to QUBO, while considering structure and characteristic metrics of input, intermediate and output representations. Our results can be used to rate valid paths of transformation in advance -- also in automated (quantum) toolchains. We support the automation aspect by considering stability, and therefore predictability, of free parameters and transformation paths. Moreover, our findings can be used in the emerging era of error correction (since considering structure in a high abstraction layer can benefit error-correcting codes in layers below). We also show that current research is closely linked to quadratisation techniques and their mathematical foundation.
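The k-SAT-to-QUBO transformation and the quadratisation techniques the abstract refers to can be illustrated with one standard textbook construction (a minimal sketch; this is a generic route, not necessarily the exact transformation or parameter choices studied in the paper). A 3-SAT clause (x OR y OR z) is unsatisfied exactly when (1-x)(1-y)(1-z) = 1; this cubic indicator is quadratised with one auxiliary binary variable w via the identity -xyz = min over w in {0, 1} of w*(2 - x - y - z):

```python
from itertools import product

def clause_qubo_energy(x, y, z, w):
    """QUBO energy of one 3-SAT clause (x OR y OR z) with positive literals.

    Expanding the unsatisfied-clause indicator (1-x)(1-y)(1-z) gives
    1 - x - y - z + xy + xz + yz - xyz; the cubic term -xyz is replaced
    by the quadratic penalty w*(2 - x - y - z) with auxiliary w in {0, 1}.
    """
    return 1 - x - y - z + x*y + x*z + y*z + w * (2 - x - y - z)

# Sanity check: minimising over the auxiliary variable recovers the
# clause's truth value (energy 0 iff the clause is satisfied, else 1).
for x, y, z in product((0, 1), repeat=3):
    energy = min(clause_qubo_energy(x, y, z, w) for w in (0, 1))
    assert energy == (0 if (x or y or z) else 1)
```

Negated literals are handled by substituting x with (1 - x) before expansion; summing such per-clause penalties yields a QUBO whose minimum counts unsatisfied clauses, at the cost of one auxiliary variable per 3-clause.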
Related papers
- On the Existence of Universal Simulators of Attention [17.01811978811789]
We present solutions to identically replicate attention outputs and the underlying elementary matrix and activation operations via RASP. Our proofs, for the first time, show the existence of an algorithmically achievable data-agnostic solution, previously known to be approximated only by learning.
arXiv Detail & Related papers (2025-06-23T15:15:25Z) - An Efficient Quantum Classifier Based on Hamiltonian Representations [50.467930253994155]
Quantum machine learning (QML) is a discipline that seeks to transfer the advantages of quantum computing to data-driven tasks. We propose an efficient approach that circumvents the costs associated with data encoding by mapping inputs to a finite set of Pauli strings. We evaluate our approach on text and image classification tasks, against well-established classical and quantum models.
arXiv Detail & Related papers (2025-04-13T11:49:53Z) - A Scalable Synthesis Algorithm for Reversible Functions [1.3654846342364308]
This paper introduces a transformation-based method for exact synthesis of reversible circuits. Experimental results demonstrate significant improvements over the state-of-the-art exact synthesis methods, achieving up to 99% improvements in terms of the number of levels of T-gates.
arXiv Detail & Related papers (2025-04-03T14:29:33Z) - Transformers to Predict the Applicability of Symbolic Integration Routines [0.0]
We consider how machine learning may be used to optimise this task in a Computer Algebra System.
We train transformers that predict whether a particular integration method will be successful, and compare against the existing human-made guards.
We find the transformer can outperform these guards, gaining up to 30% accuracy and 70% precision.
arXiv Detail & Related papers (2024-10-31T14:03:37Z) - Circuit Transformer: A Transformer That Preserves Logical Equivalence [20.8279111910994]
We introduce a generative neural model, the "Circuit Transformer", which produces logic circuits strictly equivalent to given Boolean functions. A Markov decision process formulation is also proposed for optimizing certain objectives of circuits.
arXiv Detail & Related papers (2024-03-14T03:24:14Z) - Quantum Fourier Transformation Circuits Compilation [7.1069624340204465]
This research paper focuses on the domain-specific hardware mapping strategy for Quantum Fourier Transformation (QFT) circuits.
We adopt a novel approach that combines technical intuition, often referred to as "educated guesses," and sophisticated synthesis program tools.
The groundbreaking outcome of our research is the introduction of the first set of linear-depth transformed QFT circuits designed for Google Sycamore, IBM heavy-hex, and the conventional 2-dimensional (2D) grid configurations.
arXiv Detail & Related papers (2023-12-17T21:26:17Z) - Learning Transformer Programs [78.9509560355733]
We introduce a procedure for training Transformers that are mechanistically interpretable by design.
Instead of compiling human-written programs into Transformers, we design a modified Transformer that can be trained using gradient-based optimization.
The Transformer Programs can automatically find reasonable solutions, performing on par with standard Transformers of comparable size.
arXiv Detail & Related papers (2023-06-01T20:27:01Z) - Vision Transformer with Quadrangle Attention [76.35955924137986]
We propose a novel quadrangle attention (QA) method that extends the window-based attention to a general quadrangle formulation.
Our method employs an end-to-end learnable quadrangle regression module that predicts a transformation matrix to transform default windows into target quadrangles.
We integrate QA into plain and hierarchical vision transformers to create a new architecture named QFormer, which offers minor code modifications and negligible extra computational cost.
arXiv Detail & Related papers (2023-03-27T11:13:50Z) - Suppressing quantum circuit errors due to system variability [0.0]
We present a quantum circuit optimization technique that takes into account the variability in error rates that is inherent across present day noisy quantum computing platforms.
We show that it is possible to recover, on average, nearly of the missing fidelity using better qubit selection via efficient-to-compute cost functions.
arXiv Detail & Related papers (2022-09-30T15:00:38Z) - Scaling Quantum Approximate Optimization on Near-term Hardware [49.94954584453379]
We quantify scaling of the expected resource requirements by optimized circuits for hardware architectures with varying levels of connectivity.
We show that the number of measurements, and hence the total time to synthesize a solution, grows exponentially in problem size and problem graph degree.
These problems may be alleviated by increasing hardware connectivity or by recently proposed modifications to the QAOA that achieve higher performance with fewer circuit layers.
arXiv Detail & Related papers (2022-01-06T21:02:30Z) - Statistically Meaningful Approximation: a Case Study on Approximating Turing Machines with Transformers [50.85524803885483]
This work proposes a formal definition of statistically meaningful (SM) approximation which requires the approximating network to exhibit good statistical learnability.
We study SM approximation for two function classes: circuits and Turing machines.
arXiv Detail & Related papers (2021-07-28T04:28:55Z) - Transformer-based Machine Learning for Fast SAT Solvers and Logic Synthesis [63.53283025435107]
CNF-based SAT and MaxSAT solvers are central to logic synthesis and verification systems.
In this work, we propose a one-shot model derived from the Transformer architecture to solve the MaxSAT problem.
arXiv Detail & Related papers (2021-07-15T04:47:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.