Neural Circuit Synthesis from Specification Patterns
- URL: http://arxiv.org/abs/2107.11864v1
- Date: Sun, 25 Jul 2021 18:17:33 GMT
- Title: Neural Circuit Synthesis from Specification Patterns
- Authors: Frederik Schmitt, Christopher Hahn, Markus N. Rabe and Bernd
Finkbeiner
- Abstract summary: We train hierarchical Transformers on the task of synthesizing hardware circuits directly out of high-level logical specifications.
New approaches using machine learning might open a lot of possibilities in this area, but suffer from the lack of sufficient amounts of training data.
We show that hierarchical Transformers trained on this synthetic data solve a significant portion of problems from the synthesis competitions.
- Score: 5.7923858184309385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We train hierarchical Transformers on the task of synthesizing hardware
circuits directly out of high-level logical specifications in linear-time
temporal logic (LTL). The LTL synthesis problem is a well-known algorithmic
challenge with a long history and an annual competition is organized to track
the improvement of algorithms and tooling over time. New approaches using
machine learning might open a lot of possibilities in this area, but suffer
from the lack of sufficient amounts of training data. In this paper, we
consider a method to generate large amounts of additional training data, i.e.,
pairs of specifications and circuits implementing them. We ensure that this
synthetic data is sufficiently close to human-written specifications by mining
common patterns from the specifications used in the synthesis competitions. We
show that hierarchical Transformers trained on this synthetic data solve a
significant portion of problems from the synthesis competitions, and even
out-of-distribution examples from a recent case study.
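To make the data-generation idea concrete, here is a minimal, hypothetical sketch (not the authors' actual pipeline): it instantiates a few common LTL specification patterns of the kind found in SYNTCOMP-style benchmarks and conjoins them into new synthetic specifications. The pattern set, signal names, and sampling scheme below are illustrative assumptions; in the paper's setting, each sampled specification would still be handed to a classical synthesis tool to obtain the circuit that serves as the training target.

```python
import random

# Toy sketch of pattern-based specification sampling (illustrative only).
# The patterns, signal names, and sampling scheme are assumptions, not the
# authors' mined pattern set.

PATTERNS = [
    "G ({req} -> F {grant})",        # request/response
    "G !({grant} & {other_grant})",  # mutual exclusion of grants
    "G ({grant} -> X !{grant})",     # no two consecutive grants
    "G F {req}",                     # requests recur infinitely often
]

def instantiate(pattern: str, inputs: list[str], outputs: list[str]) -> str:
    """Fill a pattern's placeholders with randomly chosen signal names."""
    grant, other_grant = random.sample(outputs, 2)
    return pattern.format(req=random.choice(inputs),
                          grant=grant,
                          other_grant=other_grant)

def sample_specification(num_conjuncts: int = 4) -> str:
    """Conjoin several instantiated patterns into one synthetic LTL spec."""
    inputs, outputs = ["r0", "r1"], ["g0", "g1", "g2"]
    conjuncts = [instantiate(random.choice(PATTERNS), inputs, outputs)
                 for _ in range(num_conjuncts)]
    return " & ".join(f"({c})" for c in conjuncts)

if __name__ == "__main__":
    # A classical LTL synthesis tool would then turn this specification
    # into a circuit, yielding one (specification, circuit) training pair.
    print(sample_specification())
```

A generator along these lines is what keeps the synthetic specifications structurally close to human-written ones: the building blocks are mined patterns rather than arbitrary random formulas.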
Related papers
- Genetic Instruct: Scaling up Synthetic Generation of Coding Instructions for Large Language Models [54.51932175059004]
We introduce a scalable method for generating synthetic instructions to enhance the code generation capability of Large Language Models.
The proposed algorithm, Genetic-Instruct, mimics evolutionary processes, utilizing self-instruction to create numerous synthetic samples from a limited number of seeds.
arXiv Detail & Related papers (2024-07-29T20:42:59Z)
- Synthetic Oversampling: Theory and A Practical Approach Using LLMs to Address Data Imbalance [16.047084318753377]
Imbalanced data and spurious correlations are common challenges in machine learning and data science.
Oversampling, which artificially increases the number of instances in the underrepresented classes, has been widely adopted to tackle these challenges.
We introduce OPAL, a systematic oversampling approach that leverages the capabilities of large language models to generate high-quality synthetic data for minority groups.
arXiv Detail & Related papers (2024-06-05T21:24:26Z)
- Retrieval-Guided Reinforcement Learning for Boolean Circuit Minimization [23.075466444266528]
This study conducts a thorough examination of learning and search techniques for logic synthesis.
We present ABC-RL, which tunes an α parameter to adjust recommendations from pre-trained agents during the search process.
Our findings show substantial improvements in the quality of results (QoR) of synthesized circuits, up to 24.8% over state-of-the-art techniques.
arXiv Detail & Related papers (2024-01-22T18:46:30Z)
- Transformers as Statisticians: Provable In-Context Learning with In-Context Algorithm Selection [88.23337313766353]
This work first provides a comprehensive statistical theory for transformers to perform ICL.
We show that transformers can implement a broad class of standard machine learning algorithms in context.
A single transformer can adaptively select different base ICL algorithms.
arXiv Detail & Related papers (2023-06-07T17:59:31Z)
- End-to-End Meta-Bayesian Optimisation with Transformer Neural Processes [52.818579746354665]
This paper proposes the first end-to-end differentiable meta-BO framework that generalises neural processes to learn acquisition functions via transformer architectures.
We enable this end-to-end framework with reinforcement learning (RL) to tackle the lack of labelled acquisition data.
arXiv Detail & Related papers (2023-05-25T10:58:46Z)
- INVICTUS: Optimizing Boolean Logic Circuit Synthesis via Synergistic Learning and Search [18.558280701880136]
State-of-the-art logic synthesis algorithms apply a large number of logic minimization steps.
INVICTUS generates a sequence of logic minimizations based on a training dataset of previously seen designs.
arXiv Detail & Related papers (2023-05-22T15:50:42Z)
- Iterative Circuit Repair Against Formal Specifications [3.7277730514654555]
We present a deep learning approach for repairing sequential circuits against formal specifications given in linear-time temporal logic (LTL).
We propose a separated hierarchical Transformer for multimodal representation learning of the formal specification and the circuit.
Our proposed repair mechanism significantly improves the automated synthesis of circuits from specifications with Transformers.
arXiv Detail & Related papers (2023-03-02T11:05:10Z)
- Compositional Generalization and Decomposition in Neural Program Synthesis [59.356261137313275]
In this paper, we focus on measuring the ability of learned program synthesizers to compositionally generalize.
We first characterize several different axes along which program synthesis methods would be desired to generalize.
We introduce a benchmark suite of tasks to assess these abilities based on two popular existing datasets.
arXiv Detail & Related papers (2022-04-07T22:16:05Z)
- Creating Synthetic Datasets via Evolution for Neural Program Synthesis [77.34726150561087]
We show that some program synthesis approaches generalize poorly to data distributions different from that of the randomly generated examples.
We propose a new, adversarial approach to control the bias of synthetic data distributions and show that it outperforms current approaches.
arXiv Detail & Related papers (2020-03-23T18:34:15Z)
- Towards Neural-Guided Program Synthesis for Linear Temporal Logic Specifications [26.547133495699093]
We use a neural network to learn a Q-function that is then used to guide search, and to construct programs that are subsequently verified for correctness (a generic sketch of this idea appears after this list).
Our method is unique in combining search with deep learning to realize synthesis.
arXiv Detail & Related papers (2019-12-31T17:09:49Z)
- Synthetic Datasets for Neural Program Synthesis [66.20924952964117]
We propose a new methodology for controlling and evaluating the bias of synthetic data distributions over both programs and specifications.
We demonstrate, using the Karel DSL and a small Calculator DSL, that training deep networks on these distributions leads to improved cross-distribution generalization performance.
arXiv Detail & Related papers (2019-12-27T21:28:10Z)
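For the neural-guided LTL program synthesis entry above, a minimal, hypothetical sketch of the general idea (a learned Q-function steering best-first search, with candidates checked against the specification) might look as follows. The function names, search loop, and verifier interface are illustrative assumptions, not that paper's implementation.

```python
import heapq
from typing import Callable, Iterable, List, Optional, Tuple

# Generic sketch of Q-function-guided best-first search for program synthesis.
# The Q-function, expansion rule, and verifier are user-supplied placeholders.

def neural_guided_search(
    initial: str,
    expand: Callable[[str], Iterable[str]],  # yields successor partial programs
    q_value: Callable[[str], float],         # learned estimate of a candidate's promise
    verify: Callable[[str], bool],           # checks a candidate against the specification
    budget: int = 1000,
) -> Optional[str]:
    # Max-heap emulated by negating scores.
    frontier: List[Tuple[float, str]] = [(-q_value(initial), initial)]
    for _ in range(budget):
        if not frontier:
            break
        _, candidate = heapq.heappop(frontier)
        if verify(candidate):
            return candidate  # verified correct with respect to the specification
        for successor in expand(candidate):
            heapq.heappush(frontier, (-q_value(successor), successor))
    return None  # budget exhausted without a verified program
```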