Training Multilayer Perceptrons by Sampling with Quantum Annealers
- URL: http://arxiv.org/abs/2303.12352v1
- Date: Wed, 22 Mar 2023 07:40:01 GMT
- Title: Training Multilayer Perceptrons by Sampling with Quantum Annealers
- Authors: Frances Fengyi Yang and Michele Sasdelli and Tat-Jun Chin
- Abstract summary: Many neural networks for vision applications are feedforward structures.
Backpropagation is currently the most effective technique to train such networks for supervised learning.
- Score: 38.046974698940545
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A successful application of quantum annealing to machine learning is training
restricted Boltzmann machines (RBM). However, many neural networks for vision
applications are feedforward structures, such as multilayer perceptrons (MLP).
Backpropagation is currently the most effective technique to train MLPs for
supervised learning. This paper aims to be forward-looking by exploring the
training of MLPs using quantum annealers. We exploit an equivalence between
MLPs and energy-based models (EBM), which are a variation of RBMs with a
maximum conditional likelihood objective. This leads to a strategy to train
MLPs with quantum annealers as a sampling engine. We prove our setup for MLPs
with sigmoid activation functions and one hidden layer, and demonstrate
training of binary image classifiers on small subsets of the MNIST and
Fashion-MNIST datasets using the D-Wave quantum annealer. Although problem
sizes that are feasible on current annealers are limited, we obtained
comprehensive results on feasible instances that validate our ideas. Our work
establishes the potential of quantum computing for training MLPs.
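The strategy, in outline: with the input layer clamped to a training image, the hidden and output units of the equivalent EBM define a QUBO whose low-energy configurations the annealer samples; sample averages then stand in for the model expectations in the conditional-likelihood gradient. Below is a minimal illustrative sketch, not the authors' code: it uses D-Wave's classical `dimod.SimulatedAnnealingSampler` as a stand-in for the hardware annealer, and the toy layer sizes, energy function, and use of annealer reads as approximate Boltzmann samples are assumptions for illustration.
```python
import numpy as np
import dimod

rng = np.random.default_rng(0)
n_vis, n_hid = 8, 4                                # toy sizes, far below real image dimensions
W1 = rng.normal(scale=0.1, size=(n_vis, n_hid))    # input-to-hidden weights
W2 = rng.normal(scale=0.1, size=n_hid)             # hidden-to-output weights
b = np.zeros(n_hid)                                # hidden biases
c = 0.0                                            # output bias

def clamped_qubo(v):
    """QUBO over binary (h, y) for E(v,h,y) = -v.W1.h - b.h - (W2.h)*y - c*y,
    with the visible units v clamped. Variables 0..n_hid-1 are the hidden
    units; variable n_hid is the binary class label y."""
    Q = {(j, j): -(v @ W1[:, j] + b[j]) for j in range(n_hid)}
    Q[(n_hid, n_hid)] = -c
    for j in range(n_hid):
        Q[(j, n_hid)] = -W2[j]                     # hidden-output couplings
    return Q

v = rng.integers(0, 2, size=n_vis)                 # one clamped binary "image"
sampler = dimod.SimulatedAnnealingSampler()        # stand-in for the quantum annealer
result = sampler.sample_qubo(clamped_qubo(v), num_reads=200)

# Treat the reads as approximate Boltzmann samples: their averages estimate
# the expectations needed for a contrastive-divergence-style gradient step.
mean, total = np.zeros(n_hid + 1), 0
for datum in result.data(fields=["sample", "num_occurrences"]):
    s = np.array([datum.sample[i] for i in range(n_hid + 1)])
    mean += datum.num_occurrences * s
    total += datum.num_occurrences
mean /= total
print("E[h | v] ~", mean[:n_hid], "  E[y | v] ~", mean[-1])
```
On D-Wave hardware the same QUBO would be minor-embedded onto the annealer's qubit graph, e.g. by wrapping `DWaveSampler` in `EmbeddingComposite` from the `dwave-system` package.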
Related papers
- Exploring Tensor Network Algorithms as a Quantum-Inspired Method for Quantum Extreme Learning Machine [0.26013878609420266]
Quantum Extreme Learning Machine (QELM) has emerged as a promising hybrid quantum machine learning (QML) method.
We explore how quantum-inspired techniques like tensor networks (TNs) can be used for the QELM algorithm.
This study also underscores the potential of tensor networks as quantum-inspired algorithms to enhance the capability of quantum machine learning algorithms to study datasets with large numbers of features.
arXiv Detail & Related papers (2025-03-07T16:03:24Z) - Leveraging Pre-Trained Neural Networks to Enhance Machine Learning with Variational Quantum Circuits [48.33631905972908]
We introduce an innovative approach that utilizes pre-trained neural networks to enhance Variational Quantum Circuits (VQC).
This technique effectively separates approximation error from qubit count and removes the need for restrictive conditions.
Our results extend to applications such as human genome analysis, demonstrating the broad applicability of our approach.
arXiv Detail & Related papers (2024-11-13T12:03:39Z) - KANQAS: Kolmogorov-Arnold Network for Quantum Architecture Search [0.0]
We evaluate the practicality of Kolmogorov-Arnold Networks (KANs) in quantum state preparation and quantum chemistry.
In quantum state preparation, our results show that in a noiseless scenario, the probability of success and the number of optimal quantum circuit configurations to generate the multi-qubit maximally entangled states are $2\times$ to $5\times$ higher than for Multi-Layer Perceptrons (MLPs).
In tackling quantum chemistry problems, we enhance the recently proposed QAS algorithm by integrating curriculum reinforcement learning with a KAN structure instead of the traditional MLP structure.
arXiv Detail & Related papers (2024-06-25T15:17:01Z) - Active learning of Boltzmann samplers and potential energies with quantum mechanical accuracy [1.7633275579210346]
We develop an approach combining enhanced sampling with deep generative models and active learning of a machine learning potential.
We apply this method to study the isomerization of an ultrasmall silver nanocluster, belonging to a set of systems with diverse applications in the fields of medicine and biology.
arXiv Detail & Related papers (2024-01-29T19:01:31Z) - QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z) - NTK-approximating MLP Fusion for Efficient Language Model Fine-tuning [40.994306592119266]
Fine-tuning a pre-trained language model (PLM) emerges as the predominant strategy in many natural language processing applications.
Some general approaches (e.g. quantization and distillation) have been widely studied to reduce the compute and memory costs of PLM fine-tuning.
We propose to coin a lightweight PLM through NTK-approximating modules in fusion.
arXiv Detail & Related papers (2023-07-18T03:12:51Z) - PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language
Models [52.09865918265002]
We propose a novel "quantize before fine-tuning" framework, PreQuant.
PreQuant is compatible with various quantization strategies, with outlier-aware fine-tuning incorporated to correct the induced quantization error.
We demonstrate the effectiveness of PreQuant on the GLUE benchmark using BERT, RoBERTa, and T5.
arXiv Detail & Related papers (2023-05-30T08:41:33Z) - Visual Simulation Software Demonstration for Quantum Multi-Drone
Reinforcement Learning [14.299752746509348]
This paper presents a visual simulation software framework for a novel QMARL algorithm to control autonomous multi-drone systems.
Our proposed QMARL framework accomplishes reasonable reward convergence and service quality performance with fewer trainable parameters than the classical MARL.
arXiv Detail & Related papers (2022-11-24T06:08:24Z) - Decomposition of Matrix Product States into Shallow Quantum Circuits [62.5210028594015]
Tensor network (TN) algorithms can be mapped to parametrized quantum circuits (PQCs).
We propose a new protocol for approximating TN states using realistic quantum circuits.
Our results reveal one particular protocol, involving sequential growth and optimization of the quantum circuit, to outperform all other methods.
arXiv Detail & Related papers (2022-09-01T17:08:41Z) - Quantum Multi-Agent Meta Reinforcement Learning [22.17932723673392]
We re-design multi-agent reinforcement learning based on the unique characteristics of quantum neural networks (QNNs).
We propose quantum meta MARL (QM2ARL) that first applies angle training for meta-QNN learning, followed by pole training for few-shot or local-QNN training.
arXiv Detail & Related papers (2022-08-22T22:46:52Z) - Energy-Efficient and Federated Meta-Learning via Projected Stochastic
Gradient Ascent [79.58680275615752]
We propose an energy-efficient federated meta-learning framework.
We assume each task is owned by a separate agent, so a limited number of tasks is used to train a meta-model.
arXiv Detail & Related papers (2021-05-31T08:15:44Z)