Provably Trainable Rotationally Equivariant Quantum Machine Learning
- URL: http://arxiv.org/abs/2311.05873v3
- Date: Sun, 14 Jan 2024 23:27:03 GMT
- Title: Provably Trainable Rotationally Equivariant Quantum Machine Learning
- Authors: Maxwell T. West, Jamie Heredge, Martin Sevior and Muhammad Usman
- Abstract summary: We introduce a family of rotationally equivariant QML models built upon the quantum Fourier transform.
We numerically test our models on a dataset of simulated scanning tunnelling microscope images of phosphorus impurities in silicon.
- Score: 0.6435156676256051
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Exploiting the power of quantum computation to realise superior machine
learning algorithms has been a major research focus of recent years, but the
prospects of quantum machine learning (QML) remain dampened by considerable
technical challenges. A particularly significant issue is that generic QML
models suffer from so-called barren plateaus in their training landscapes --
large regions where cost function gradients vanish exponentially in the number
of qubits employed, rendering large models effectively untrainable. A leading
strategy for combating this effect is to build problem-specific models which
take into account the symmetries of their data in order to focus on a smaller,
relevant subset of Hilbert space. In this work, we introduce a family of
rotationally equivariant QML models built upon the quantum Fourier transform,
and leverage recent insights from the Lie-algebraic study of QML models to
prove that (a subset of) our models do not exhibit barren plateaus. In addition
to our analytical results we numerically test our rotationally equivariant
models on a dataset of simulated scanning tunnelling microscope images of
phosphorus impurities in silicon, where rotational symmetry naturally arises,
and find that they dramatically outperform their generic counterparts in
practice.
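The core idea behind such symmetry-respecting models can be illustrated classically: averaging a feature map over the rotation group guarantees that the result is unchanged when the input is rotated. Below is a minimal NumPy sketch under the simplifying assumption of discrete 90-degree (C4) rotations, with a hypothetical toy feature map; it is an illustration of group averaging only, not the paper's QFT-based quantum circuit.

```python
import numpy as np

def c4_invariant_features(image, feature_fn):
    """Average a feature map over the C4 rotation group.

    Averaging over all group elements (here, the four 90-degree
    rotations produced by np.rot90) guarantees that the output is
    invariant under any rotation in the group.
    """
    feats = [feature_fn(np.rot90(image, k)) for k in range(4)]
    return np.mean(feats, axis=0)

# hypothetical toy feature map: column sums of the image
feature_fn = lambda img: img.sum(axis=0)

rng = np.random.default_rng(0)
img = rng.random((4, 4))

f0 = c4_invariant_features(img, feature_fn)
f1 = c4_invariant_features(np.rot90(img), feature_fn)
assert np.allclose(f0, f1)  # identical features for the rotated image
```

Because the averaged feature only depends on the orbit of the image under the group, a model built on it automatically focuses on the symmetry-respecting subset of inputs, which is the classical analogue of restricting to a smaller, relevant subset of Hilbert space.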
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Approximately Equivariant Quantum Neural Network for $p4m$ Group Symmetries in Images [30.01160824817612]
This work proposes equivariant Quantum Convolutional Neural Networks (EquivQCNNs) for image classification under planar $p4m$ symmetry.
We present the results tested in different use cases, such as phase detection of the 2D Ising model and classification of the extended MNIST dataset.
arXiv Detail & Related papers (2023-10-03T18:01:02Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Lorentz group equivariant autoencoders [6.858459233149096]
Lorentz group autoencoder (LGAE)
We develop an autoencoder model equivariant with respect to the proper, orthochronous Lorentz group $\mathrm{SO}^+(3,1)$, with a latent space living in the representations of the group.
We present our architecture and several experimental results on jets at the LHC and find it outperforms graph and convolutional neural network baseline models on several compression, reconstruction, and anomaly detection metrics.
arXiv Detail & Related papers (2022-12-14T17:19:46Z) - Reflection Equivariant Quantum Neural Networks for Enhanced Image Classification [0.7232471205719458]
We build new machine learning models which explicitly respect the symmetries inherent in their data, an approach known as geometric quantum machine learning (GQML).
We find that these networks are capable of consistently and significantly outperforming generic ansatze on complicated real-world image datasets.
arXiv Detail & Related papers (2022-12-01T04:10:26Z) - Group-Invariant Quantum Machine Learning [0.0]
Quantum Machine Learning (QML) models are aimed at learning from data encoded in quantum states.
Group-invariant models produce outputs that remain invariant under the action of any element of the symmetry group $\mathfrak{G}$ associated to the dataset.
We present theoretical results underpinning the design of $\mathfrak{G}$-invariant models, and exemplify their application through several paradigmatic QML classification tasks.
arXiv Detail & Related papers (2022-05-04T18:04:32Z) - Fermionic approach to variational quantum simulation of Kitaev spin models [50.92854230325576]
Kitaev spin models are well known for being exactly solvable in a certain parameter regime via a mapping to free fermions.
We use classical simulations to explore a novel variational ansatz that takes advantage of this fermionic representation.
We also comment on the implications of our results for simulating non-Abelian anyons on quantum computers.
arXiv Detail & Related papers (2022-04-11T18:00:01Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Variational learning of quantum ground states on spiking neuromorphic hardware [0.0]
High-dimensional sampling spaces and transient autocorrelations confront neural networks with a challenging computational bottleneck.
Compared to conventional neural networks, physical-model devices offer a fast, efficient and inherently parallel substrate.
We demonstrate the ability of a neuromorphic chip to represent the ground states of quantum spin models by variational energy minimization.
arXiv Detail & Related papers (2021-09-30T14:39:45Z) - Quantum-tailored machine-learning characterization of a superconducting
qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
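Several of the papers above build on group-invariant constructions: an observable averaged (twirled) over a symmetry group $\mathfrak{G}$ commutes with every group element, so its expectation value cannot distinguish a state from its symmetry-transformed counterpart. The NumPy sketch below illustrates this with a 2-qubit swap symmetry as a stand-in; the matrices and states are illustrative, not drawn from any of the papers listed.

```python
import numpy as np

# Twirling: average an observable over a symmetry group's unitary
# representation. The twirled observable commutes with every group
# element, so its expectation value is unchanged when the state is
# transformed by any symmetry of the data.

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=float)
group = [np.eye(4), SWAP]  # representation of qubit-exchange symmetry

O = np.diag([3.0, 1.0, 0.0, 2.0])  # generic diagonal observable
O_twirl = sum(U @ O @ U.T for U in group) / len(group)

# the twirled observable commutes with the symmetry
assert np.allclose(SWAP @ O_twirl, O_twirl @ SWAP)

psi = np.array([0.6, 0.8, 0.0, 0.0])  # normalised 2-qubit state
exp_val = psi @ O_twirl @ psi
exp_val_swapped = (SWAP @ psi) @ O_twirl @ (SWAP @ psi)
assert np.isclose(exp_val, exp_val_swapped)  # invariant expectation
```

Restricting a model's measurements to such symmetrised observables is one concrete way the geometric QML works above enforce invariance by construction rather than hoping it is learned from data.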
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.