Quantum Architecture Search with Unsupervised Representation Learning
- URL: http://arxiv.org/abs/2401.11576v2
- Date: Tue, 19 Mar 2024 12:53:24 GMT
- Title: Quantum Architecture Search with Unsupervised Representation Learning
- Authors: Yize Sun, Zixin Wu, Yunpu Ma, Volker Tresp
- Abstract summary: We propose a framework for unsupervised representation learning for quantum architecture search (QAS)
Our framework is predictor-free, eliminating the need for a large number of labeled quantum circuits.
The results show that our framework can find well-performing candidate circuits more efficiently within a limited number of searches.
- Score: 24.698519892763283
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Utilizing unsupervised representation learning for quantum architecture search (QAS) represents a cutting-edge approach poised to realize potential quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) devices. Most QAS algorithms combine their search space and search algorithm together and thus generally require evaluating a large number of quantum circuits during the search process. Predictor-based QAS algorithms can alleviate this problem by directly estimating the performance of circuits from their structures. However, a high-performance predictor generally requires very time-consuming labeling to obtain a large number of labeled quantum circuits. Recently, the classical neural architecture search algorithm Arch2vec inspired us by showing that architecture search can benefit from decoupling unsupervised representation learning from the search process. Whether unsupervised representation learning can help QAS without any predictor is still an open question. In this work, we propose a framework for QAS with unsupervised representation learning and visualize how unsupervised architecture representation learning encourages quantum circuit architectures with similar connections and operators to cluster together. Specifically, our framework decouples the QAS process from unsupervised architecture representation learning, so that the learned representation can be directly applied to different downstream applications. Furthermore, our framework is predictor-free, eliminating the need for a large number of labeled quantum circuits. During the search process, we use two algorithms, REINFORCE and Bayesian Optimization, to search directly on the latent representation, and compare them with Random Search. The results show that our framework can find well-performing candidate circuits more efficiently within a limited number of searches.
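The two-stage idea in the abstract — first learn a circuit representation without labels, then search directly in the latent space — can be sketched in a few lines. This is a minimal, hypothetical illustration: the "circuit encodings" are toy one-hot gate grids, PCA stands in for the paper's learned autoencoder latent space, and `evaluate` is a stand-in scoring function rather than a real circuit simulation. Random search over latent points is shown; the paper's REINFORCE and Bayesian Optimization variants would replace the sampling loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "circuit encodings": one gate choice per (qubit, layer) slot,
# flattened to one-hot vectors. Purely illustrative.
n_circuits, n_slots, n_gates = 200, 8, 4
gates = rng.integers(0, n_gates, size=(n_circuits, n_slots))
X = np.eye(n_gates)[gates].reshape(n_circuits, -1)

# Stage 1 (unsupervised, label-free): learn a low-dimensional
# representation. PCA via SVD stands in for the autoencoder latent space.
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 4
Z = (X - mean) @ Vt[:k].T  # latent vectors, one per circuit

# Hypothetical evaluator (stand-in for simulating the circuit on a
# downstream task); higher is better.
target = rng.integers(0, n_gates, size=n_slots)
def evaluate(idx: int) -> float:
    return float((gates[idx] == target).mean())

# Stage 2 (search, decoupled from stage 1): sample points in latent
# space and evaluate the nearest known circuit — random search baseline.
def latent_random_search(budget: int = 30):
    best_score, best_idx = -1.0, -1
    for _ in range(budget):
        z = rng.normal(scale=Z.std(axis=0))
        idx = int(np.argmin(((Z - z) ** 2).sum(axis=1)))
        score = evaluate(idx)
        if score > best_score:
            best_score, best_idx = score, idx
    return best_score, best_idx

score, idx = latent_random_search()
```

Because stage 1 never touches the evaluator, the same latent space `Z` could be reused with a different search algorithm or a different downstream objective, which is the decoupling the abstract emphasizes.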
Related papers
- Qubit-Wise Architecture Search Method for Variational Quantum Circuits [11.790545710021593]
We propose a novel qubit-wise architecture search (QWAS) method, which progressively searches one qubit's configuration per stage.
Our proposed method can balance the exploration and exploitation of circuit performance and size on some real-world tasks, such as MNIST, Fashion, and MOSI.
arXiv Detail & Related papers (2024-03-07T07:08:57Z) - Quantum Subroutine for Variance Estimation: Algorithmic Design and Applications [80.04533958880862]
Quantum computing sets the foundation for new ways of designing algorithms.
New challenges arise concerning the fields in which quantum speedup can be achieved.
Designing quantum subroutines that are more efficient than their classical counterparts lays a solid foundation for new, powerful quantum algorithms.
arXiv Detail & Related papers (2024-02-26T09:32:07Z) - QArchSearch: A Scalable Quantum Architecture Search Package [1.725192300740999]
We present QArchSearch, an AI-based quantum architecture search package with the QTensor library as a backend.
We show that the search package is able to efficiently scale the search to large quantum circuits and enables the exploration of more complex models for different quantum applications.
arXiv Detail & Related papers (2023-10-11T20:00:33Z) - GSQAS: Graph Self-supervised Quantum Architecture Search [0.18899300124593643]
Existing Quantum Architecture Search (QAS) algorithms require evaluating a large number of quantum circuits during the search process.
We propose GSQAS, a graph self-supervised QAS, which trains a predictor based on self-supervised learning.
GSQAS outperforms the state-of-the-art predictor-based QAS, achieving better performance with fewer labeled circuits.
arXiv Detail & Related papers (2023-03-22T08:35:28Z) - Unified Functional Hashing in Automatic Machine Learning [58.77232199682271]
We show that large efficiency gains can be obtained by employing a fast unified functional hash.
Our hash is "functional" in that it identifies equivalent candidates even if they were represented or coded differently.
We show dramatic improvements on multiple AutoML domains, including neural architecture search and algorithm discovery.
arXiv Detail & Related papers (2023-02-10T18:50:37Z) - Quantum matching pursuit: A quantum algorithm for sparse representations [3.4376560669160394]
Representing signals with sparse vectors has a wide range of applications that range from image and video coding to shape representation and health monitoring.
Quantum computing has recently shown promising speed-ups in many representation learning tasks.
arXiv Detail & Related papers (2022-08-08T13:50:57Z) - MQBench: Towards Reproducible and Deployable Model Quantization Benchmark [53.12623958951738]
MQBench is a first attempt to evaluate, analyze, and benchmark the reproducibility and deployability of model quantization algorithms.
We choose multiple platforms for real-world deployment, including CPU, GPU, ASIC, and DSP, and evaluate extensive state-of-the-art quantization algorithms.
We conduct a comprehensive analysis and find considerable intuitive or counter-intuitive insights.
arXiv Detail & Related papers (2021-11-05T23:38:44Z) - Quantum Embedding Search for Quantum Machine Learning [2.7612093695074456]
We introduce a novel quantum embedding search algorithm (QES), pronounced as "quest".
We establish the connection between the structures of quantum embedding and the representations of directed multi-graphs, enabling a well-defined search space.
We demonstrate the feasibility of our proposed approach on synthetic and Iris datasets, which empirically shows that quantum embedding architectures found by QES outperform manual designs.
arXiv Detail & Related papers (2021-05-25T11:50:57Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS)
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - CATCH: Context-based Meta Reinforcement Learning for Transferrable Architecture Search [102.67142711824748]
CATCH is a novel Context-bAsed meTa reinforcement learning algorithm for transferrable arChitecture searcH.
The combination of meta-learning and RL allows CATCH to efficiently adapt to new tasks while being agnostic to search spaces.
It can also handle cross-domain architecture search, identifying competitive networks on ImageNet, COCO, and Cityscapes.
arXiv Detail & Related papers (2020-07-18T09:35:53Z) - Scalable NAS with Factorizable Architectural Parameters [102.51428615447703]
Neural Architecture Search (NAS) is an emerging topic in machine learning and computer vision.
This paper presents a scalable algorithm by factorizing a large set of candidate operators into smaller subspaces.
With a small increase in search costs and no extra costs in re-training, we find interesting architectures that were not explored before.
arXiv Detail & Related papers (2019-12-31T10:26:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.