Quantum-inspired activation functions and quantum Chebyshev-polynomial network
- URL: http://arxiv.org/abs/2404.05901v3
- Date: Wed, 23 Oct 2024 13:28:28 GMT
- Title: Quantum-inspired activation functions and quantum Chebyshev-polynomial network
- Authors: Shaozhi Li, M Sabbir Salek, Yao Wang, Mashrur Chowdhury
- Abstract summary: We investigate the functional expressibility of quantum circuits integrated within a convolutional neural network (CNN).
We develop a hybrid quantum Chebyshev-polynomial network (QCPN) based on the properties of quantum activation functions.
Our findings suggest that quantum-inspired activation functions can reduce model depth while maintaining high learning capability.
- Score: 6.09437748873686
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Driven by the significant advantages offered by quantum computing, research in quantum machine learning has increased in recent years. While quantum speed-up has been demonstrated in some applications of quantum machine learning, a comprehensive understanding of its underlying mechanisms for improved performance remains elusive. Our study addresses this problem by investigating the functional expressibility of quantum circuits integrated within a convolutional neural network (CNN). Through numerical experiments on the MNIST, Fashion MNIST, and Letter datasets, our hybrid quantum-classical CNN model demonstrates superior feature selection capabilities and substantially reduces the required training steps compared to classical CNNs. Notably, we observe similar performance improvements when incorporating three other quantum-inspired activation functions in classical neural networks, indicating the benefits of adopting quantum-inspired activation functions. Additionally, we develop a hybrid quantum Chebyshev-polynomial network (QCPN) based on the properties of quantum activation functions. We demonstrate that a three-layer QCPN can approximate any continuous function, a feat not achievable by a standard three-layer classical neural network. Our findings suggest that quantum-inspired activation functions can reduce model depth while maintaining high learning capability, making them a promising approach for optimizing large-scale machine-learning models. We also outline future research directions for leveraging quantum advantages in machine learning, aiming to unlock further potential in this rapidly evolving field.
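The abstract does not spell out the exact form of the quantum-inspired activation functions or of the QCPN layers, but the underlying ideas can be sketched directly. The snippet below is a minimal, illustrative PyTorch sketch under two assumptions not stated in the paper: the quantum-inspired activation is modeled as the bounded response cos(x), mimicking the Pauli-Z expectation value read out from a single-qubit rotation circuit, and the Chebyshev-polynomial layer expands its inputs with the recurrence T_0(x) = 1, T_1(x) = x, T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x). All class names, layer sizes, and the polynomial degree are hypothetical choices for illustration only.

```python
import torch
import torch.nn as nn


class QuantumInspiredActivation(nn.Module):
    """Bounded trigonometric activation: an illustrative stand-in for the
    Pauli-Z expectation value cos(x) read out from a single-qubit rotation
    circuit (assumption; the paper's exact functional form is not given here)."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.cos(x)


class ChebyshevFeatureLayer(nn.Module):
    """Expands each (squashed) input into Chebyshev polynomials T_0..T_degree
    via T_{k+1}(x) = 2x*T_k(x) - T_{k-1}(x), then mixes them with a linear map.
    The class name, degree, and feature sizes are hypothetical."""

    def __init__(self, in_features: int, out_features: int, degree: int = 4):
        super().__init__()
        self.degree = degree
        self.mix = nn.Linear(in_features * (degree + 1), out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = torch.tanh(x)                  # squash inputs into [-1, 1], where T_k is well behaved
        polys = [torch.ones_like(x), x]    # T_0 and T_1
        for _ in range(2, self.degree + 1):
            polys.append(2 * x * polys[-1] - polys[-2])
        return self.mix(torch.cat(polys[: self.degree + 1], dim=-1))


# A hypothetical three-layer stack in the spirit of the QCPN described above.
model = nn.Sequential(
    ChebyshevFeatureLayer(28 * 28, 128),
    QuantumInspiredActivation(),
    ChebyshevFeatureLayer(128, 64),
    QuantumInspiredActivation(),
    nn.Linear(64, 10),
)

logits = model(torch.randn(32, 28 * 28))  # e.g. a batch of flattened MNIST-sized inputs
print(logits.shape)                       # torch.Size([32, 10])
```

Stacking bounded, oscillatory feature maps followed by linear mixing in this way gives a rough stand-in for the three-layer QCPN discussed in the abstract; it is a sketch of the general pattern, not the authors' implementation.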
Related papers
- Let the Quantum Creep In: Designing Quantum Neural Network Models by Gradually Swapping Out Classical Components [1.024113475677323]
Modern AI systems are often built on neural networks.
We propose a framework where classical neural network layers are gradually replaced by quantum layers.
We conduct numerical experiments on image classification datasets to demonstrate the change of performance brought by the systematic introduction of quantum components.
arXiv Detail & Related papers (2024-09-26T07:01:29Z)
- A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z)
- Quantum Generative Adversarial Networks: Bridging Classical and Quantum Realms [0.6827423171182153]
We explore the synergistic fusion of classical and quantum computing paradigms within the realm of Generative Adversarial Networks (GANs).
Our objective is to seamlessly integrate quantum computational elements into the conventional GAN architecture, thereby unlocking novel pathways for enhanced training processes.
This research is positioned at the forefront of quantum-enhanced machine learning, presenting a critical stride towards harnessing the computational power of quantum systems.
arXiv Detail & Related papers (2023-12-15T16:51:36Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes through a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even superior to, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Recent Advances for Quantum Neural Networks in Generative Learning [98.88205308106778]
Quantum generative learning models (QGLMs) may surpass their classical counterparts.
We review the current progress of QGLMs from the perspective of machine learning.
We discuss the potential applications of QGLMs in both conventional machine learning tasks and quantum physics.
arXiv Detail & Related papers (2022-06-07T07:32:57Z)
- QDCNN: Quantum Dilated Convolutional Neural Network [1.52292571922932]
We propose a novel hybrid quantum-classical algorithm called quantum dilated convolutional neural networks (QDCNNs).
Our method extends the concept of dilated convolution, which has been widely applied in modern deep learning algorithms, to the context of hybrid neural networks.
The proposed QDCNNs are able to capture larger context during the quantum convolution process while reducing the computational cost.
arXiv Detail & Related papers (2021-10-29T10:24:34Z)
- On exploring the potential of quantum auto-encoder for learning quantum systems [60.909817434753315]
We devise three effective QAE-based learning protocols to address three learning problems that are computationally hard for classical approaches.
Our work sheds new light on developing advanced quantum learning algorithms to accomplish hard quantum physics and quantum information processing tasks.
arXiv Detail & Related papers (2021-06-29T14:01:40Z)
- Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
- Quantum neural networks with deep residual learning [29.929891641757273]
In this paper, a novel quantum neural network with deep residual learning (ResQNN) is proposed.
Our ResQNN is able to learn an unknown unitary and achieves remarkable performance.
arXiv Detail & Related papers (2020-12-14T18:11:07Z)
- The power of quantum neural networks [3.327474729829121]
In the near term, however, the benefits of quantum machine learning are not so clear.
We use tools from information geometry to define a notion of expressibility for quantum and classical models.
We show that quantum neural networks are able to achieve a significantly better effective dimension than comparable classical neural networks.
arXiv Detail & Related papers (2020-10-30T18:13:32Z)
- Experimental Quantum Generative Adversarial Networks for Image Generation [93.06926114985761]
We experimentally achieve the learning and generation of real-world hand-written digit images on a superconducting quantum processor.
Our work provides guidance for developing advanced quantum generative models on near-term quantum devices.
arXiv Detail & Related papers (2020-10-13T06:57:17Z)