Circuit Transformer: A Transformer That Preserves Logical Equivalence
- URL: http://arxiv.org/abs/2403.13838v2
- Date: Sun, 02 Feb 2025 03:55:55 GMT
- Title: Circuit Transformer: A Transformer That Preserves Logical Equivalence
- Authors: Xihan Li, Xing Li, Lei Chen, Xing Zhang, Mingxuan Yuan, Jun Wang,
- Abstract summary: We introduce a generative neural model, the "Circuit Transformer", which produces logic circuits strictly equivalent to given Boolean functions.
A Markov decision process formulation is also proposed for optimizing certain objectives of circuits.
- Score: 20.8279111910994
- Abstract: Implementing Boolean functions with circuits consisting of logic gates is fundamental in digital computer design. However, the implemented circuit must be exactly equivalent, which hinders generative neural approaches on this task due to their occasionally wrong predictions. In this study, we introduce a generative neural model, the "Circuit Transformer", which eliminates such wrong predictions and produces logic circuits strictly equivalent to given Boolean functions. The main idea is a carefully designed decoding mechanism that builds a circuit step-by-step by generating tokens, and that has beneficial "cutoff properties" that block a candidate token once it invalidates equivalence. In this way, the proposed model works similarly to typical LLMs while logical equivalence is strictly preserved. A Markov decision process formulation is also proposed for optimizing certain objectives of circuits. Experimentally, we trained an 88-million-parameter Circuit Transformer to generate equivalent yet more compact forms of input circuits, outperforming existing neural approaches on both synthetic and real-world benchmarks, without any violation of equivalence constraints.
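The "cutoff" decoding idea described in the abstract can be sketched in a few lines: at each decoding step, every candidate token is checked against the target truth table, and any token that would invalidate equivalence is masked out before selection. The sketch below is illustrative only; it collapses the step-by-step gate generation into a single masked choice among complete candidate circuits, and the candidate set, scoring, and `equivalent` check are assumptions, not the paper's implementation.

```python
import itertools

# Target: a 2-input Boolean function given as a truth table (XOR here).
TARGET = {(a, b): a ^ b for a, b in itertools.product((0, 1), repeat=2)}

# Hypothetical token vocabulary: complete candidate circuits. The real
# model emits gates one at a time; this sketch uses one masked choice.
CANDIDATES = {
    "AND": lambda a, b: a & b,
    "OR": lambda a, b: a | b,
    "XOR": lambda a, b: a ^ b,
    "NAND": lambda a, b: 1 - (a & b),
}

def equivalent(fn):
    """Cutoff check: does this candidate match the target on every input?"""
    return all(fn(a, b) == out for (a, b), out in TARGET.items())

def masked_decode(scores):
    """Pick the highest-scoring candidate that preserves equivalence.

    `scores` plays the role of the Transformer's token logits; candidates
    failing the equivalence check are masked out before the argmax.
    """
    valid = {tok: s for tok, s in scores.items() if equivalent(CANDIDATES[tok])}
    return max(valid, key=valid.get)

# Even if the model prefers a wrong token (AND here), the mask blocks it.
print(masked_decode({"AND": 0.9, "XOR": 0.5, "OR": 0.8, "NAND": 0.7}))  # XOR
```

Because invalid tokens are removed before selection rather than corrected afterwards, equivalence holds by construction for every output, which is the guarantee the paper emphasizes.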
Related papers
- Architect of the Bits World: Masked Autoregressive Modeling for Circuit Generation Guided by Truth Table [5.300504429005315]
We propose a novel approach integrating conditional generative models with differentiable architecture search (DAS) for circuit generation.
Our approach first introduces CircuitVQ, a circuit tokenizer trained based on our Circuit AutoEncoder.
We then develop CircuitAR, a masked autoregressive model leveraging CircuitVQ as the tokenizer.
arXiv Detail & Related papers (2025-02-18T11:13:03Z) - Emergent unitary designs for encoded qubits from coherent errors and syndrome measurements [1.8854166566682866]
We propose an efficient approach to generate unitary designs for encoded qubits in surface codes.
We numerically show that the ensemble of logical unitaries converges to a unitary design in the thermodynamic limit.
Our results provide a practical way to realize unitary designs on encoded qubits.
arXiv Detail & Related papers (2024-12-05T18:36:14Z) - Algorithmic Capabilities of Random Transformers [49.73113518329544]
We investigate what functions can be learned by randomly initialized transformers in which only the embedding layers are optimized.
We find that these random transformers can perform a wide range of meaningful algorithmic tasks.
Our results indicate that some algorithmic capabilities are present in transformers even before these models are trained.
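The setup above (only embeddings trained, all other layers left at random initialization) can be illustrated with a small numpy sketch. Everything here is an assumption for illustration: a frozen random two-layer body stands in for the untrained transformer layers, and a toy regression task stands in for the paper's algorithmic tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen random "body": never updated during training.
W1 = rng.normal(size=(16, 16)) / 4.0
W2 = rng.normal(size=(16, 1)) / 4.0

def forward(E, x):
    # Embed inputs with E, then pass through the frozen body.
    return np.tanh(x @ E @ W1) @ W2

# Toy task: regress y = sum of the 4 input features.
X = rng.normal(size=(64, 4))
y = X.sum(axis=1, keepdims=True)
E = rng.normal(size=(4, 16)) * 0.1   # the only trainable parameters

def mse(E):
    return float(np.mean((forward(E, X) - y) ** 2))

mse_init = mse(E)
lr = 1e-2
for _ in range(500):
    h_pre = X @ E @ W1
    err = np.tanh(h_pre) @ W2 - y
    # Backpropagate through tanh and the frozen weights; update only E.
    grad_h = (err @ W2.T) * (1.0 - np.tanh(h_pre) ** 2)
    E -= lr * (X.T @ grad_h @ W1.T) / len(X)
mse_final = mse(E)
```

Even though the body's weights are never touched, adapting the embedding alone reduces the loss, which is the spirit of the finding summarized above.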
arXiv Detail & Related papers (2024-10-06T06:04:23Z) - Logic Synthesis with Generative Deep Neural Networks [20.8279111910994]
We introduce a logic synthesis rewriting operator based on the Circuit Transformer model, named "ctrw" (Circuit Transformer Rewriting).
We propose a two-stage training scheme for the Circuit Transformer tailored for logic synthesis, with iterative improvement of optimality through self-improvement training.
We also integrate the Circuit Transformer with state-of-the-art rewriting techniques to address scalability issues, allowing for guided DAG-aware rewriting.
arXiv Detail & Related papers (2024-06-07T07:16:40Z) - Benchmarking logical three-qubit quantum Fourier transform encoded in the Steane code on a trapped-ion quantum computer [3.2821436094760026]
We logically encode three-qubit circuits for the quantum Fourier transform (QFT).
We benchmark the circuits on the Quantinuum H2-11 trapped-ion quantum computer.
We compare the logical QFT benchmark results to predictions based on the logical component benchmarks.
arXiv Detail & Related papers (2024-04-12T17:27:27Z) - CktGNN: Circuit Graph Neural Network for Electronic Design Automation [67.29634073660239]
This paper presents a Circuit Graph Neural Network (CktGNN) that simultaneously automates the circuit topology generation and device sizing.
We introduce Open Circuit Benchmark (OCB), an open-sourced dataset that contains 10K distinct operational amplifiers.
Our work paves the way toward a learning-based open-sourced design automation for analog circuits.
arXiv Detail & Related papers (2023-08-31T02:20:25Z) - Adaptive Planning Search Algorithm for Analog Circuit Verification [53.97809573610992]
We propose a machine learning (ML) approach that uses fewer simulations.
We show that the proposed approach is able to provide OCCs closer to the specifications for all circuits.
arXiv Detail & Related papers (2023-06-23T12:57:46Z) - Transformers as Statisticians: Provable In-Context Learning with In-Context Algorithm Selection [88.23337313766353]
This work first provides a comprehensive statistical theory for transformers to perform ICL.
We show that transformers can implement a broad class of standard machine learning algorithms in context.
A single transformer can adaptively select different base ICL algorithms.
arXiv Detail & Related papers (2023-06-07T17:59:31Z) - Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples [10.482805367361818]
We propose a novel, fully explainable neural approach to synthesis of logic circuits from input-output examples.
Our method can be employed for a virtually arbitrary choice of atoms.
arXiv Detail & Related papers (2022-10-29T14:06:42Z) - Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
arXiv Detail & Related papers (2020-06-02T23:38:35Z) - Hardware-Encoding Grid States in a Non-Reciprocal Superconducting Circuit [62.997667081978825]
We present a circuit design composed of a non-reciprocal device and Josephson junctions whose ground space is doubly degenerate and the ground states are approximate codewords of the Gottesman-Kitaev-Preskill (GKP) code.
We find that the circuit is naturally protected against the common noise channels in superconducting circuits, such as charge and flux noise, implying that it can be used for passive quantum error correction.
arXiv Detail & Related papers (2020-02-18T16:45:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.