Circuit Transformer: End-to-end Circuit Design by Predicting the Next Gate
- URL: http://arxiv.org/abs/2403.13838v1
- Date: Thu, 14 Mar 2024 03:24:14 GMT
- Title: Circuit Transformer: End-to-end Circuit Design by Predicting the Next Gate
- Authors: Xihan Li, Xing Li, Lei Chen, Xing Zhang, Mingxuan Yuan, Jun Wang
- Abstract summary: Language, a prominent human ability to express through sequential symbols, has been computationally mastered by recent advances in large language models (LLMs).
LLMs have shown unprecedented capabilities in understanding and reasoning.
Can circuits also be mastered by a sufficiently large "circuit model", which can conquer electronic design tasks by simply predicting the next logic gate?
- Score: 20.8279111910994
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Language, a prominent human ability to express through sequential symbols, has been computationally mastered by recent advances in large language models (LLMs). By predicting the next word recurrently with huge neural models, LLMs have shown unprecedented capabilities in understanding and reasoning. Circuit, as the "language" of electronic design, specifies the functionality of an electronic device by cascade connections of logic gates. Then, can circuits also be mastered by a sufficiently large "circuit model", which can conquer electronic design tasks by simply predicting the next logic gate? In this work, we take the first step to explore such possibilities. Two primary barriers impede the straightforward application of LLMs to circuits: their complex, non-sequential structure, and the intolerance of hallucination due to strict constraints (e.g., equivalence). For the first barrier, we encode a circuit as a memory-less, depth-first traversal trajectory, which allows Transformer-based neural models to better leverage its structural information, and predict the next gate on the trajectory as a circuit model. For the second barrier, we introduce an equivalence-preserving decoding process, which ensures that every token in the generated trajectory adheres to the specified equivalence constraints. Moreover, the circuit model can be regarded as a stochastic policy to tackle optimization-oriented circuit design tasks. Experimentally, we trained a Transformer-based model of 88M parameters, named "Circuit Transformer", which demonstrates impressive performance in end-to-end logic synthesis. With Monte-Carlo tree search, Circuit Transformer significantly improves over resyn2 while retaining strict equivalence, showcasing the potential of generative AI in conquering electronic design challenges.
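To make the trajectory encoding concrete, the following is a minimal Python sketch of serializing a circuit as a depth-first sequence of gate tokens. The `Node` class and token vocabulary are illustrative assumptions for this summary, not the paper's actual encoding.

```python
# Illustrative sketch (not the paper's actual encoding): serialize a circuit
# as a memory-less, depth-first trajectory of gate tokens, so a Transformer
# can be trained to predict the next gate on the trajectory.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                      # "PI" (primary input), "AND", or "NOT"
    children: list = field(default_factory=list)
    name: str = ""                 # used only for primary inputs

def encode_dfs(node: Node) -> list[str]:
    """Depth-first traversal; shared subcircuits are re-expanded, which is
    what makes the trajectory memory-less."""
    tokens = [node.name if node.kind == "PI" else node.kind]
    for child in node.children:
        tokens.extend(encode_dfs(child))
    return tokens

# Example: f = NOT(AND(a, b)) serializes to ['NOT', 'AND', 'a', 'b'].
a, b = Node("PI", name="a"), Node("PI", name="b")
f = Node("NOT", [Node("AND", [a, b])])
print(encode_dfs(f))
```

Under this view, the equivalence-preserving decoding described in the abstract can be read as masking, at each generation step, any next-gate token that would violate the specified equivalence constraints, so a completed trajectory is correct by construction.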
Related papers
- Architect of the Bits World: Masked Autoregressive Modeling for Circuit Generation Guided by Truth Table [5.300504429005315]
We propose a novel approach integrating conditional generative models with differentiable architecture search (DAS) for circuit generation.
Our approach first introduces CircuitVQ, a circuit tokenizer trained based on our Circuit AutoEncoder.
We then develop CircuitAR, a masked autoregressive model leveraging CircuitVQ as the tokenizer.
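As a rough illustration of masked autoregressive generation in this spirit, here is a hedged PyTorch sketch of a confidence-based unmasking loop. The `model` interface and the MaskGIT-style schedule are assumptions; CircuitAR's actual procedure may differ.

```python
# Hedged sketch of masked autoregressive decoding (MaskGIT-style schedule);
# the model interface and schedule are assumptions, not CircuitAR's API.
import torch

def masked_ar_decode(model, seq_len, mask_id, steps=8):
    tokens = torch.full((1, seq_len), mask_id)        # start fully masked
    for step in range(steps):
        logits = model(tokens)                        # (1, seq_len, vocab)
        conf, pred = logits.softmax(-1).max(-1)       # per-position confidence
        still_masked = tokens == mask_id
        if not still_masked.any():                    # everything decoded
            break
        k = max(1, int(still_masked.sum() * (step + 1) / steps))
        conf = conf.masked_fill(~still_masked, -1.0)  # never overwrite fixed tokens
        idx = conf.topk(k, dim=-1).indices            # most confident positions
        tokens.scatter_(1, idx, pred.gather(1, idx))  # unmask them
    return tokens
```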
arXiv Detail & Related papers (2025-02-18T11:13:03Z)
- Emergent unitary designs for encoded qubits from coherent errors and syndrome measurements [1.8854166566682866]
We propose an efficient approach to generate unitary designs for encoded qubits in surface codes.
We numerically show that the ensemble of logical unitaries converges to a unitary design in the thermodynamic limit.
Our results provide a practical way to realize unitary designs on encoded qubits.
arXiv Detail & Related papers (2024-12-05T18:36:14Z)
- Algorithmic Capabilities of Random Transformers [49.73113518329544]
We investigate what functions can be learned by randomly initialized transformers in which only the embedding layers are optimized.
We find that these random transformers can perform a wide range of meaningful algorithmic tasks.
Our results indicate that some algorithmic capabilities are present in transformers even before these models are trained.
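A hedged sketch of this setup in PyTorch: a randomly initialized Transformer body is frozen and only the embedding (plus, here, an output projection) is optimized on a toy copy task. All sizes and the task are assumptions for illustration.

```python
# Hedged sketch: freeze a randomly initialized Transformer; train only the
# embeddings (plus an output head) on a toy copy task. Sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab, d_model = 128, 64
embed = nn.Embedding(vocab, d_model)
body = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(d_model, vocab)

for p in body.parameters():          # the random body stays fixed
    p.requires_grad = False

opt = torch.optim.Adam(list(embed.parameters()) + list(head.parameters()), lr=1e-3)

x = torch.randint(0, vocab, (8, 16))         # batch of random token ids
logits = head(body(embed(x)))                # gradients reach only embed/head
loss = F.cross_entropy(logits.reshape(-1, vocab), x.reshape(-1))
loss.backward()
opt.step()
```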
arXiv Detail & Related papers (2024-10-06T06:04:23Z)
- Logic Synthesis with Generative Deep Neural Networks [20.8279111910994]
We introduce a logic synthesis rewriting operator based on the Circuit Transformer model, named "ctrw" (Circuit Transformer Rewriting).
We propose a two-stage training scheme for the Circuit Transformer tailored for logic synthesis, with iterative improvement of optimality through self-improvement training.
We also integrate the Circuit Transformer with state-of-the-art rewriting techniques to address scalability issues, allowing for guided DAG-aware rewriting.
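The self-improvement idea can be sketched as a loop that retrains on the model's own best equivalent outputs. `synthesize`, `equivalent`, `gate_count`, and `model.fit` are hypothetical helpers for illustration, not the paper's API.

```python
# Hypothetical sketch of self-improvement training: resample circuits from the
# current model, keep the smallest equivalent one per spec, retrain on those.
def self_improve(model, specs, rounds=3, samples=8):
    for _ in range(rounds):
        improved = []
        for spec in specs:
            candidates = [synthesize(model, spec) for _ in range(samples)]
            valid = [c for c in candidates if equivalent(c, spec)]
            if valid:
                improved.append((spec, min(valid, key=gate_count)))
        model.fit(improved)          # stage two: train on the improved labels
    return model
```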
arXiv Detail & Related papers (2024-06-07T07:16:40Z)
- Benchmarking logical three-qubit quantum Fourier transform encoded in the Steane code on a trapped-ion quantum computer [3.2821436094760026]
We logically encode three-qubit circuits for the quantum Fourier transform (QFT).
We benchmark the circuits on the Quantinuum H2-1 trapped-ion quantum computer.
We compare the logical QFT benchmark results to predictions based on the logical component benchmarks.
arXiv Detail & Related papers (2024-04-12T17:27:27Z)
- CktGNN: Circuit Graph Neural Network for Electronic Design Automation [67.29634073660239]
This paper presents a Circuit Graph Neural Network (CktGNN) that simultaneously automates the circuit topology generation and device sizing.
We introduce Open Circuit Benchmark (OCB), an open-sourced dataset that contains 10K distinct operational amplifiers.
Our work paves the way toward a learning-based open-sourced design automation for analog circuits.
arXiv Detail & Related papers (2023-08-31T02:20:25Z)
- Adaptive Planning Search Algorithm for Analog Circuit Verification [53.97809573610992]
We propose a machine learning (ML) approach, which uses fewer simulations.
We show that the proposed approach is able to provide OCCs closer to the specifications for all circuits.
arXiv Detail & Related papers (2023-06-23T12:57:46Z)
- Transformers as Statisticians: Provable In-Context Learning with In-Context Algorithm Selection [88.23337313766353]
This work first provides a comprehensive statistical theory for transformers to perform ICL.
We show that transformers can implement a broad class of standard machine learning algorithms in context.
A single transformer can adaptively select different base ICL algorithms.
arXiv Detail & Related papers (2023-06-07T17:59:31Z)
- Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples [10.482805367361818]
We propose a novel, fully explainable neural approach to synthesis of logic circuits from input-output examples.
Our method can be employed for a virtually arbitrary choice of atoms.
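To illustrate what synthesis from input-output examples specifies (this naive enumeration is not the paper's neural method), even a brute-force search over two-input gates can satisfy a small specification:

```python
# Naive illustration of the task specification: find a gate over the chosen
# atoms that matches every input-output example. Not the paper's approach.
GATES = {"AND": lambda a, b: a & b, "OR": lambda a, b: a | b,
         "XOR": lambda a, b: a ^ b, "NAND": lambda a, b: 1 - (a & b)}

examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR spec

for name, gate in GATES.items():
    if all(gate(a, b) == out for (a, b), out in examples):
        print("found:", name)   # prints: found: XOR
```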
arXiv Detail & Related papers (2022-10-29T14:06:42Z)
- Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
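For intuition, here is a minimal numerical sketch of equilibrium propagation's two-phase recipe on a toy quadratic energy. The network, energy function, and step sizes are assumptions for illustration, not the paper's analog-hardware model.

```python
# Minimal sketch of equilibrium propagation: settle to a free equilibrium,
# settle again with the output nudged toward the target, then update weights
# by contrasting the two equilibria. Toy energy, not the paper's hardware.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 3))
W = (W + W.T) / 2                              # symmetric weights

def settle(s, x, beta=0.0, y=None, steps=50, lr=0.1):
    for _ in range(steps):
        grad = s - W @ s                       # dE/ds for E = 0.5*||s||^2 - 0.5*s^T W s
        if beta:                               # nudge the output unit toward y
            grad[2] += beta * (s[2] - y)
        s = s - lr * grad
        s[0] = x                               # clamp the input unit
    return s

x, y, beta = 1.0, 0.5, 0.1
s_free = settle(np.zeros(3), x)                # free phase
s_nudged = settle(s_free.copy(), x, beta, y)   # nudged phase
dW = (np.outer(s_nudged, s_nudged) - np.outer(s_free, s_free)) / beta
W += 0.01 * dW                                 # contrastive weight update
```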
arXiv Detail & Related papers (2020-06-02T23:38:35Z)
- Hardware-Encoding Grid States in a Non-Reciprocal Superconducting Circuit [62.997667081978825]
We present a circuit design composed of a non-reciprocal device and Josephson junctions whose ground space is doubly degenerate and whose ground states are approximate codewords of the Gottesman-Kitaev-Preskill (GKP) code.
We find that the circuit is naturally protected against the common noise channels in superconducting circuits, such as charge and flux noise, implying that it can be used for passive quantum error correction.
arXiv Detail & Related papers (2020-02-18T16:45:09Z)