AI Methods for Permutation Circuit Synthesis Across Generic Topologies
- URL: http://arxiv.org/abs/2509.16020v1
- Date: Fri, 19 Sep 2025 14:28:22 GMT
- Title: AI Methods for Permutation Circuit Synthesis Across Generic Topologies
- Authors: Victor Villar, Juan Cruz-Benito, Ismael Faro, David Kremer,
- Abstract summary: Our approach uses Reinforcement Learning (RL) techniques to achieve near-optimal synthesis of permutation circuits up to 25 qubits. We train a foundational model on a generic rectangular lattice, and employ masking mechanisms to dynamically select subsets of topologies during the synthesis. This enables the synthesis of permutation circuits on any topology that can be embedded within the rectangular lattice, without the need to re-train the model.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper investigates artificial intelligence (AI) methodologies for the synthesis and transpilation of permutation circuits across generic topologies. Our approach uses Reinforcement Learning (RL) techniques to achieve near-optimal synthesis of permutation circuits up to 25 qubits. Rather than developing specialized models for individual topologies, we train a foundational model on a generic rectangular lattice and employ masking mechanisms to dynamically select subsets of topologies during synthesis. This enables the synthesis of permutation circuits on any topology that can be embedded within the rectangular lattice, without the need to re-train the model. In this paper we show results for a 5x5 lattice and compare them to previous AI topology-oriented models and classical methods, showing that our model outperforms classical heuristics, matches previous specialized AI models, and performs synthesis even for topologies that were not seen during training. We further show that the model can be fine-tuned to strengthen its performance on selected topologies of interest. This methodology allows a single trained model to efficiently synthesize circuits across diverse topologies, allowing its practical integration into transpilation workflows.
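The masking idea described in the abstract can be illustrated with a minimal sketch: represent the rectangular lattice as a nearest-neighbour coupling map, then apply an edge mask that restricts the active topology to a subset of qubits. The function names and the edge-list representation below are illustrative assumptions, not the authors' actual implementation or any specific library API.

```python
# Hypothetical sketch of topology masking over a rectangular lattice.
# The coupling-map and mask representations are illustrative assumptions.

def grid_coupling_map(rows, cols):
    """All nearest-neighbour edges of a rows x cols lattice,
    with qubits numbered row-major from 0."""
    edges = []
    for r in range(rows):
        for c in range(cols):
            q = r * cols + c
            if c + 1 < cols:
                edges.append((q, q + 1))     # horizontal neighbour
            if r + 1 < rows:
                edges.append((q, q + cols))  # vertical neighbour
    return edges

def mask_topology(edges, active_qubits):
    """Keep only edges whose endpoints both lie in the active subset,
    mimicking a mask that restricts synthesis to an embedded sub-topology."""
    active = set(active_qubits)
    return [e for e in edges if e[0] in active and e[1] in active]

full = grid_coupling_map(5, 5)         # the generic 5x5 training lattice
line = mask_topology(full, range(5))   # top row masked out as a 5-qubit line
```

Any topology obtained this way (a line, a ring fragment, an L-shape, etc.) is by construction embeddable in the lattice, which is the property that lets one trained model serve many devices without re-training.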
Related papers
- Data-Driven Dimensional Synthesis of Diverse Planar Four-bar Function Generation Mechanisms via Direct Parameterization [2.499517394718329]
We propose a data-driven framework that bypasses traditional equation-solving and optimization by leveraging supervised learning. Our method combines a synthetic dataset, an LSTM-based neural network for handling sequential precision points, and a Mixture of Experts architecture tailored to different linkage types. Experiments show our approach produces accurate, defect-free linkages across various configurations.
arXiv Detail & Related papers (2025-07-11T02:32:29Z) - STAR: Synthesis of Tailored Architectures [61.080157488857516]
We propose a new approach for the synthesis of tailored architectures (STAR). Our approach combines a novel search space based on the theory of linear input-varying systems, supporting a hierarchical numerical encoding into architecture genomes. STAR genomes are automatically refined and recombined with gradient-free, evolutionary algorithms to optimize for multiple model quality and efficiency metrics. Using STAR, we optimize large populations of new architectures, leveraging diverse computational units and interconnection patterns, improving over highly-optimized Transformers and striped hybrid models on the frontier of quality, parameter size, and inference cache for autoregressive language modeling.
arXiv Detail & Related papers (2024-11-26T18:42:42Z) - AI-Aided Kalman Filters [65.35350122917914]
The Kalman filter (KF) and its variants are among the most celebrated algorithms in signal processing. Recent developments illustrate the possibility of fusing deep neural networks (DNNs) with classic Kalman-type filtering. This article provides a tutorial-style overview of design approaches for incorporating AI in aiding KF-type algorithms.
arXiv Detail & Related papers (2024-10-16T06:47:53Z) - Learnable & Interpretable Model Combination in Dynamical Systems Modeling [0.0]
This work briefly discusses which types of model are usually combined in dynamical systems modeling. We propose a class of models that is capable of expressing mixed algebraic, discrete, and differential equation-based models. Finally, we propose a new wildcard architecture that is capable of describing arbitrary combinations of models in an easy-to-interpret fashion.
arXiv Detail & Related papers (2024-06-12T11:17:11Z) - ECToNAS: Evolutionary Cross-Topology Neural Architecture Search [0.0]
ECToNAS is a cost-efficient evolutionary cross-topology neural architecture search algorithm.
It fuses training and topology optimisation together into one lightweight, resource-friendly process.
arXiv Detail & Related papers (2024-03-08T07:36:46Z) - Transformers as Statisticians: Provable In-Context Learning with In-Context Algorithm Selection [88.23337313766353]
This work first provides a comprehensive statistical theory for transformers to perform ICL.
We show that transformers can implement a broad class of standard machine learning algorithms in context.
A single transformer can adaptively select different base ICL algorithms.
arXiv Detail & Related papers (2023-06-07T17:59:31Z) - INVICTUS: Optimizing Boolean Logic Circuit Synthesis via Synergistic Learning and Search [18.558280701880136]
State-of-the-art logic synthesis algorithms have a large number of logic minimization heuristics.
INVICTUS generates a sequence of logic minimizations based on a training dataset of previously seen designs.
arXiv Detail & Related papers (2023-05-22T15:50:42Z) - Persformer: A Transformer Architecture for Topological Machine Learning [0.0]
Persformer is the first Transformer neural network architecture that accepts persistence diagrams as input.
arXiv Detail & Related papers (2021-12-30T21:10:17Z) - Structured Reordering for Modeling Latent Alignments in Sequence Transduction [86.94309120789396]
We present an efficient dynamic programming algorithm performing exact marginal inference of separable permutations.
The resulting seq2seq model exhibits better systematic generalization than standard models on synthetic problems and NLP tasks.
arXiv Detail & Related papers (2021-06-06T21:53:54Z) - Automated and Formal Synthesis of Neural Barrier Certificates for Dynamical Models [70.70479436076238]
We introduce an automated, formal, counterexample-based approach to synthesise Barrier Certificates (BC).
The approach is underpinned by an inductive framework, which manipulates a candidate BC structured as a neural network, and a sound verifier, which either certifies the candidate's validity or generates counter-examples.
The outcomes show that we can synthesise sound BCs up to two orders of magnitude faster, with a particularly stark speedup in the verification engine.
arXiv Detail & Related papers (2020-07-07T07:39:42Z) - Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.