Circuit Transformer: End-to-end Circuit Design by Predicting the Next Gate
- URL: http://arxiv.org/abs/2403.13838v1
- Date: Thu, 14 Mar 2024 03:24:14 GMT
- Title: Circuit Transformer: End-to-end Circuit Design by Predicting the Next Gate
- Authors: Xihan Li, Xing Li, Lei Chen, Xing Zhang, Mingxuan Yuan, Jun Wang
- Abstract summary: Language, a prominent human ability to express through sequential symbols, has been computationally mastered by recent advances in large language models (LLMs).
LLMs have shown unprecedented capabilities in understanding and reasoning.
Can circuits also be mastered by a sufficiently large "circuit model", which can conquer electronic design tasks by simply predicting the next logic gate?
- Score: 20.8279111910994
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Language, a prominent human ability to express through sequential symbols, has been computationally mastered by recent advances in large language models (LLMs). By predicting the next word recurrently with huge neural models, LLMs have shown unprecedented capabilities in understanding and reasoning. A circuit, as the "language" of electronic design, specifies the functionality of an electronic device by cascaded connections of logic gates. Can circuits, then, also be mastered by a sufficiently large "circuit model", which can conquer electronic design tasks by simply predicting the next logic gate? In this work, we take the first step to explore such possibilities. Two primary barriers impede the straightforward application of LLMs to circuits: their complex, non-sequential structure, and the intolerance of hallucination due to strict constraints (e.g., equivalence). For the first barrier, we encode a circuit as a memory-less, depth-first traversal trajectory, which allows Transformer-based neural models to better leverage its structural information and predict the next gate on the trajectory as a circuit model. For the second barrier, we introduce an equivalence-preserving decoding process, which ensures that every token in the generated trajectory adheres to the specified equivalence constraints. Moreover, the circuit model can also be regarded as a stochastic policy to tackle optimization-oriented circuit design tasks. Experimentally, we trained a Transformer-based model of 88M parameters, named "Circuit Transformer", which demonstrates impressive performance in end-to-end logic synthesis. With Monte-Carlo tree search, Circuit Transformer significantly improves over resyn2 while retaining strict equivalence, showcasing the potential of generative AI in conquering electronic design challenges.
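To make the encoding idea concrete, here is a minimal sketch of turning a small circuit into a depth-first token trajectory; the `Node` representation and token names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of the "memory-less, depth-first traversal" encoding:
# each gate visit emits one token, and shared sub-circuits are expanded
# on every visit so the token sequence alone determines the circuit.
# The Node class and token names are illustrative, not the paper's code.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Node:
    kind: str                       # "INPUT", "NOT", or "AND"
    name: str = ""                  # used only for primary inputs
    fanins: Tuple["Node", ...] = ()

def encode_dfs(node: Node) -> List[str]:
    if node.kind == "INPUT":
        return [node.name]
    tokens = [node.kind]
    for child in node.fanins:       # fixed child order keeps decoding unambiguous
        tokens += encode_dfs(child)
    return tokens

# Example: f = NOT(AND(a, b)), i.e. a 2-input NAND gate.
a, b = Node("INPUT", "a"), Node("INPUT", "b")
f = Node("NOT", fanins=(Node("AND", fanins=(a, b)),))
print(encode_dfs(f))                # ['NOT', 'AND', 'a', 'b']
```

A next-gate model is then trained to predict each token from its prefix; the equivalence-preserving decoder described above would additionally mask, at every step, any token that could violate the specified equivalence constraints.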
Related papers
- Algorithmic Capabilities of Random Transformers [49.73113518329544]
We investigate what functions can be learned by randomly initialized transformers in which only the embedding layers are optimized.
We find that these random transformers can perform a wide range of meaningful algorithmic tasks.
Our results indicate that some algorithmic capabilities are present in transformers even before these models are trained.
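As a concrete reading of that setup, the following is a minimal PyTorch sketch in which the transformer body stays at its random initialization and only the embedding and output layers receive gradients; the model sizes and objective are placeholders, not the paper's configuration.

```python
# Sketch: train only the embeddings of an otherwise frozen, randomly
# initialized Transformer. Sizes and the loss are placeholders.
import torch
import torch.nn as nn

vocab, d_model = 1000, 128
embed = nn.Embedding(vocab, d_model)
body = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(d_model, vocab)

for p in body.parameters():         # keep the body random and frozen
    p.requires_grad = False

opt = torch.optim.Adam(list(embed.parameters()) + list(head.parameters()))

x = torch.randint(0, vocab, (8, 16))        # dummy token batch
logits = head(body(embed(x)))               # only embed/head are trainable
loss = logits.logsumexp(-1).mean()          # placeholder objective
loss.backward()
opt.step()
```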
arXiv Detail & Related papers (2024-10-06T06:04:23Z)
- ShortCircuit: AlphaZero-Driven Circuit Design [12.3162550019215]
Chip design relies heavily on generating circuits, such as AND-Inverter Graphs (AIGs), from functional descriptions like truth tables.
Recent advances in deep learning have aimed to accelerate circuit design, but these efforts have mostly focused on tasks other than synthesis.
We introduce ShortCircuit, a novel transformer-based architecture that leverages the structural properties of AIGs and performs efficient space exploration.
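The core feasibility test behind truth-table-to-AIG generation can be sketched as follows; the list-based AIG encoding here is an illustrative assumption, not ShortCircuit's actual representation.

```python
# Sketch: check whether a candidate AND-Inverter Graph realizes a target
# truth table. Nodes are ("in", i), ("and", a, b) or ("not", a), where
# a and b index earlier list entries; the last node is the output.
from itertools import product

def evaluate(aig, assignment):
    vals = []
    for node in aig:
        if node[0] == "in":
            vals.append(assignment[node[1]])
        elif node[0] == "and":
            vals.append(vals[node[1]] and vals[node[2]])
        else:                               # "not"
            vals.append(not vals[node[1]])
    return vals[-1]

def matches(aig, n_inputs, truth_table):
    rows = product([False, True], repeat=n_inputs)
    return all(evaluate(aig, r) == t for r, t in zip(rows, truth_table))

# XOR built from AND and NOT: (a OR b) AND NOT(a AND b).
xor = [("in", 0), ("in", 1), ("and", 0, 1), ("not", 2), ("not", 0),
       ("not", 1), ("and", 4, 5), ("not", 6), ("and", 3, 7)]
print(matches(xor, 2, [False, True, True, False]))   # True
```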
arXiv Detail & Related papers (2024-08-19T10:03:14Z)
- LaMAGIC: Language-Model-based Topology Generation for Analog Integrated Circuits [17.002169206594793]
We introduce LaMAGIC, a pioneering language model-based topology generation model.
LaMAGIC can efficiently generate an optimized circuit design from the custom specification in a single pass.
LaMAGIC achieves a success rate of up to 96% under a strict tolerance of 0.01.
arXiv Detail & Related papers (2024-07-19T22:51:41Z)
- Logic Synthesis with Generative Deep Neural Networks [20.8279111910994]
We introduce a logic synthesis rewriting operator based on the Circuit Transformer model, named "ctrw" (Circuit Transformer Rewriting).
We propose a two-stage training scheme for the Circuit Transformer tailored for logic synthesis, with iterative improvement of optimality through self-improvement training.
We also integrate the Circuit Transformer with state-of-the-art rewriting techniques to address scalability issues, allowing for guided DAG-aware rewriting.
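A DAG-aware rewriting pass of this flavor might look like the following sketch; `extract_cut`, `propose`, `equivalent`, and the circuit API are hypothetical stand-ins for the paper's actual components.

```python
# Sketch of a DAG-aware rewriting loop in the spirit of "ctrw": for each
# node, extract a small local cut, let a generative model propose a
# replacement, and accept it only if it is equivalent and smaller.
# All hooks (extract_cut, propose, equivalent, replace) are hypothetical.

def rewrite_pass(circuit, model, cut_size=4):
    for root in list(circuit.nodes):
        cut = circuit.extract_cut(root, cut_size)   # local sub-circuit
        candidate = model.propose(cut)              # next-gate generation
        # Strict-equivalence acceptance: never trade correctness for size.
        if (candidate is not None
                and equivalent(candidate, cut)
                and candidate.num_gates() < cut.num_gates()):
            circuit.replace(cut, candidate)
    return circuit
```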
arXiv Detail & Related papers (2024-06-07T07:16:40Z)
- CktGNN: Circuit Graph Neural Network for Electronic Design Automation [67.29634073660239]
This paper presents a Circuit Graph Neural Network (CktGNN) that simultaneously automates circuit topology generation and device sizing.
We introduce Open Circuit Benchmark (OCB), an open-sourced dataset that contains 10K distinct operational amplifiers.
Our work paves the way toward learning-based, open-sourced design automation for analog circuits.
arXiv Detail & Related papers (2023-08-31T02:20:25Z)
- Transformers as Statisticians: Provable In-Context Learning with In-Context Algorithm Selection [88.23337313766353]
This work first provides a comprehensive statistical theory for transformers to perform in-context learning (ICL).
We show that transformers can implement a broad class of standard machine learning algorithms in context.
A single transformer can adaptively select different base ICL algorithms.
arXiv Detail & Related papers (2023-06-07T17:59:31Z)
- Graph Neural Network Autoencoders for Efficient Quantum Circuit Optimisation [69.43216268165402]
We show, for the first time, how to use graph neural network (GNN) autoencoders for the optimisation of quantum circuits.
We construct directed acyclic graphs from the quantum circuits, encode the graphs, and use the encodings to represent RL states.
Our method is a first realistic step towards very-large-scale RL-based quantum circuit optimisation.
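The DAG construction the summary refers to can be sketched as below: each gate links to the previous gate on every qubit wire it touches; the gate-list format is an illustrative assumption.

```python
# Sketch: build a circuit DAG from a gate list by connecting each gate
# to the previous gate on every qubit wire it acts on. A GNN autoencoder
# would then embed this DAG to serve as the RL state.
def circuit_to_dag(gates):
    """gates: list of (name, qubits). Returns (node indices, edges)."""
    last_on_wire = {}               # qubit -> index of last gate on that wire
    edges = []
    for i, (_name, qubits) in enumerate(gates):
        for q in qubits:
            if q in last_on_wire:
                edges.append((last_on_wire[q], i))
            last_on_wire[q] = i
    return list(range(len(gates))), edges

gates = [("h", (0,)), ("cx", (0, 1)), ("rz", (1,))]
print(circuit_to_dag(gates))        # ([0, 1, 2], [(0, 1), (1, 2)])
```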
arXiv Detail & Related papers (2023-03-06T16:51:30Z)
- Resilient superconducting-element design with genetic algorithms [3.53163169498295]
Superconducting quantum circuits exhibit atomic-like energy spectra and selection rules, such as ladder and three-level configurations.
We show that our multi-loop circuits are robust under random fluctuations in their circuit parameters, i.e., under possible fabrication flaws.
arXiv Detail & Related papers (2023-02-03T16:24:35Z) - Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling
and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
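A minimal message-passing step with node-voltage regression as the pretraining target might look like this; the plain-PyTorch layer and all sizes are illustrative assumptions rather than the paper's architecture.

```python
# Sketch: one message-passing layer over a circuit graph, pretrained to
# regress per-node output voltages. Plain PyTorch; sizes illustrative.
import torch
import torch.nn as nn

class MessagePassing(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.upd = nn.Linear(2 * dim, dim)

    def forward(self, h, edges):            # h: [n, dim], edges: [2, m]
        src, dst = edges
        agg = torch.zeros_like(h)
        agg.index_add_(0, dst, self.msg(h[src]))    # sum incoming messages
        return torch.relu(self.upd(torch.cat([h, agg], dim=-1)))

n, dim = 5, 16
h = torch.randn(n, dim)                     # node features (device attributes)
edges = torch.tensor([[0, 1, 2, 3],         # source nodes
                      [1, 2, 3, 4]])        # destination nodes
gnn, head = MessagePassing(dim), nn.Linear(dim, 1)
voltage_pred = head(gnn(h, edges))          # [n, 1] per-node voltage estimate
loss = nn.functional.mse_loss(voltage_pred, torch.randn(n, 1))
loss.backward()                             # one pretraining step
```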
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- Hardware-Encoding Grid States in a Non-Reciprocal Superconducting Circuit [62.997667081978825]
We present a circuit design, composed of a non-reciprocal device and Josephson junctions, whose ground space is doubly degenerate and whose ground states are approximate codewords of the Gottesman-Kitaev-Preskill (GKP) code.
We find that the circuit is naturally protected against the common noise channels in superconducting circuits, such as charge and flux noise, implying that it can be used for passive quantum error correction.
arXiv Detail & Related papers (2020-02-18T16:45:09Z)