Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks
with Quantum Computation
- URL: http://arxiv.org/abs/2301.11936v2
- Date: Mon, 11 Sep 2023 13:00:29 GMT
- Title: Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks
with Quantum Computation
- Authors: Hayata Yamasaki, Sathyawageeswar Subramanian, Satoshi Hayakawa, Sho
Sonoda
- Abstract summary: Ridgelet transform has been a fundamental mathematical tool in the theoretical studies of neural networks.
We develop a quantum ridgelet transform (QRT), which implements the ridgelet transform of a quantum state within a linear runtime $O(D)$ of quantum computation.
As an application, we show that one can use QRT as a fundamental subroutine for QML to efficiently find a sparse trainable subnetwork of large shallow wide neural networks.
- Score: 8.947825738917869
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A significant challenge in the field of quantum machine learning (QML) is to
establish applications of quantum computation to accelerate common tasks in
machine learning such as those for neural networks. Ridgelet transform has been
a fundamental mathematical tool in the theoretical studies of neural networks,
but the practical applicability of ridgelet transform to conducting learning
tasks was limited since its numerical implementation by conventional classical
computation requires an exponential runtime $\exp(O(D))$ as data dimension $D$
increases. To address this problem, we develop a quantum ridgelet transform
(QRT), which implements the ridgelet transform of a quantum state within a
linear runtime $O(D)$ of quantum computation. As an application, we also show
that one can use QRT as a fundamental subroutine for QML to efficiently find a
sparse trainable subnetwork of large shallow wide neural networks without
conducting large-scale optimization of the original network. This application
discovers an efficient way in this regime to demonstrate the lottery ticket
hypothesis on finding such a sparse trainable neural network. These results
open an avenue of QML for accelerating learning tasks with commonly used
classical neural networks.
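The exponential classical cost is visible already in the most naive discretization: evaluating the ridgelet transform $\mathcal{R}f(a,b)=\int f(x)\,\psi(a\cdot x-b)\,dx$ on a grid over the $D$-dimensional direction parameter $a$ requires $n^D$ grid points. A minimal numpy sketch of this scaling (the grid sizes, target function, and wavelet profile below are illustrative assumptions, not the paper's construction):

```python
import numpy as np
from itertools import product

def discrete_ridgelet(f_samples, xs, psi, a_list, b_grid):
    """Monte-Carlo estimate of R f(a, b) = E_x[ f(x) psi(a.x - b) ]."""
    out = np.empty((len(a_list), len(b_grid)))
    for i, a in enumerate(a_list):
        for j, b in enumerate(b_grid):
            out[i, j] = np.mean(f_samples * psi(xs @ a - b))
    return out

D, n = 2, 4
# Discretizing the direction a with n points per axis gives n**D grid
# points -- the exp(O(D)) cost of the classical implementation.
a_list = [np.array(a) for a in product(np.linspace(-1.0, 1.0, n), repeat=D)]
b_grid = np.linspace(-1.0, 1.0, n)

rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, size=(200, D))          # sample points in R^D
f_samples = np.exp(-np.sum(xs ** 2, axis=1))        # target function f(x)
psi = lambda t: (1.0 - t ** 2) * np.exp(-t ** 2 / 2)  # Mexican-hat profile

Rf = discrete_ridgelet(f_samples, xs, psi, a_list, b_grid)
print(Rf.shape, len(a_list))
```

Doubling $D$ squares the number of directions, which is why replacing this grid evaluation with a linear-runtime $O(D)$ quantum subroutine is an exponential speedup.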
Related papers
- Training Classical Neural Networks by Quantum Machine Learning [9.002305736350833]
This work proposes a training scheme for classical neural networks (NNs) that utilizes the exponentially large Hilbert space of a quantum system.
Unlike existing quantum machine learning (QML) methods, the results obtained from quantum computers using our approach can be directly used on classical computers.
arXiv Detail & Related papers (2024-02-26T10:16:21Z)
- Reservoir Computing via Quantum Recurrent Neural Networks [0.5999777817331317]
Existing VQC or QNN-based methods require significant computational resources to perform gradient-based optimization of quantum circuit parameters.
In this work, we approach sequential modeling by applying a reservoir computing (RC) framework to quantum recurrent neural networks (QRNN-RC)
Our numerical simulations show that the QRNN-RC can reach results comparable to fully trained QRNN models for several function approximation and time series tasks.
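For intuition, the reservoir-computing trick that QRNN-RC transfers to the quantum setting can be sketched classically: keep the recurrent weights fixed and random, and fit only a linear readout by least squares. A minimal echo-state-network sketch with assumed hyperparameters, not the paper's quantum model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, T = 50, 300
W_in = rng.normal(scale=0.5, size=(n_res, 1))       # fixed input weights
W_res = rng.normal(size=(n_res, n_res))             # fixed recurrent weights
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))   # spectral radius < 1

u = np.sin(np.linspace(0, 8 * np.pi, T))            # input signal
target = np.roll(u, -1)                             # predict the next step

# Drive the untrained reservoir and record its states.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in[:, 0] * u[t] + W_res @ x)
    states[t] = x

# Training touches only the linear readout: one least-squares solve,
# no gradient-based optimization of the recurrent/circuit parameters.
W_out, *_ = np.linalg.lstsq(states, target, rcond=None)
mse = np.mean((states @ W_out - target) ** 2)
print(f"readout MSE: {mse:.5f}")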
arXiv Detail & Related papers (2022-11-04T17:30:46Z)
- Accelerating the training of single-layer binary neural networks using the HHL quantum algorithm [58.720142291102135]
This paper shows that useful information can be extracted from the quantum-mechanical implementation of the Harrow-Hassidim-Lloyd (HHL) algorithm and used to reduce the complexity of finding the solution on the classical side.
arXiv Detail & Related papers (2022-10-23T11:58:05Z)
- Optimizing Tensor Network Contraction Using Reinforcement Learning [86.05566365115729]
We propose a Reinforcement Learning (RL) approach combined with Graph Neural Networks (GNN) to address the contraction ordering problem.
The problem is extremely challenging due to the huge search space, the heavy-tailed reward distribution, and the difficulty of credit assignment.
We show how a carefully implemented RL-agent that uses a GNN as the basic policy construct can address these challenges.
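The cost at stake is easy to appreciate even in the simplest special case, a matrix chain, where the contraction order alone changes the arithmetic cost by orders of magnitude (the dimensions below are illustrative only, not from the paper):

```python
def matmul_cost(m, k, n):
    """Scalar multiplications needed for an (m x k) @ (k x n) product."""
    return m * k * n

# Chain A(10x1000) @ B(1000x5) @ C(5x200):
cost_left = matmul_cost(10, 1000, 5) + matmul_cost(10, 5, 200)       # (AB)C
cost_right = matmul_cost(1000, 5, 200) + matmul_cost(10, 1000, 200)  # A(BC)
print(cost_left, cost_right)  # the two orders differ by a factor of 50
```

For general tensor networks the number of possible orders grows combinatorially, which is the search space the RL+GNN agent explores.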
arXiv Detail & Related papers (2022-04-18T21:45:13Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- A Quantum Convolutional Neural Network for Image Classification [7.745213180689952]
We propose a novel neural network model named Quantum Convolutional Neural Network (QCNN)
QCNN is based on implementable quantum circuits and has a structure similar to that of classical convolutional neural networks.
Numerical simulation results on the MNIST dataset demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-08T06:47:34Z)
- Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- Quantum neural networks with deep residual learning [29.929891641757273]
In this paper, a novel quantum neural network with deep residual learning (ResQNN) is proposed.
Our ResQNN is able to learn an unknown unitary and achieves remarkable performance.
arXiv Detail & Related papers (2020-12-14T18:11:07Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by QNN, then it can also be effectively learned by QNN even with gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.