Representation Theory for Geometric Quantum Machine Learning
- URL: http://arxiv.org/abs/2210.07980v1
- Date: Fri, 14 Oct 2022 17:25:36 GMT
- Title: Representation Theory for Geometric Quantum Machine Learning
- Authors: Michael Ragone, Paolo Braccia, Quynh T. Nguyen, Louis Schatzki,
Patrick J. Coles, Frederic Sauvage, Martin Larocca, M. Cerezo
- Abstract summary: Recent advances in classical machine learning have shown that creating models with inductive biases encoding the symmetries of a problem can greatly improve performance.
Geometric Quantum Machine Learning (GQML) will play a crucial role in developing problem-specific and quantum-aware models.
We present an introduction to representation theory tools from the optics of quantum learning, driven by key examples involving discrete and continuous groups.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in classical machine learning have shown that creating models
with inductive biases encoding the symmetries of a problem can greatly improve
performance. Importation of these ideas, combined with an existing rich body of
work at the nexus of quantum theory and symmetry, has given rise to the field
of Geometric Quantum Machine Learning (GQML). Following the success of its
classical counterpart, it is reasonable to expect that GQML will play a crucial
role in developing problem-specific and quantum-aware models capable of
achieving a computational advantage. Despite the simplicity of the main idea of
GQML -- create architectures respecting the symmetries of the data -- its
practical implementation requires a significant amount of knowledge of group
representation theory. We present an introduction to representation theory
tools from the optics of quantum learning, driven by key examples involving
discrete and continuous groups. These examples are sewn together by an
exposition outlining the formal capture of GQML symmetries via "label
invariance under the action of a group representation", a brief (but rigorous)
tour through finite and compact Lie group representation theory, a
reexamination of ubiquitous tools like Haar integration and twirling, and an
overview of some successful strategies for detecting symmetries.
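The twirling operation named in the abstract, i.e. averaging an operator over a group representation (the finite-group analogue of Haar integration), can be sketched in a few lines. This is a hedged illustration only: the choice of the group $Z_2$ represented by $\{I, X\}$ on a single qubit is hypothetical and not an example taken from the paper.

```python
import numpy as np

# Unitary representation of Z_2 on one qubit: the identity and Pauli X.
# (Illustrative choice; any finite set of unitaries closed under
# multiplication would do.)
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
reps = [I, X]

def twirl(A, reps):
    """Average A over the group: T(A) = (1/|G|) sum_g U_g A U_g^dagger."""
    return sum(U @ A @ U.conj().T for U in reps) / len(reps)

A = np.array([[1.0, 2.0], [3.0, 4.0]])
TA = twirl(A, reps)

# The twirled operator commutes with every representation matrix,
# i.e. it lies in the commutant of the representation -- the property
# that makes twirling useful for building equivariant architectures.
assert np.allclose(TA @ X, X @ TA)
```

The same averaging, with the sum replaced by an integral over the Haar measure, yields the projector onto equivariant operators for compact Lie groups.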
Related papers
- Provably Trainable Rotationally Equivariant Quantum Machine Learning [0.6435156676256051]
We introduce a family of rotationally equivariant QML models built upon the quantum Fourier transform.
We numerically test our models on a dataset of simulated scanning tunnelling microscope images of phosphorus impurities in silicon.
arXiv Detail & Related papers (2023-11-10T05:10:06Z)
- ${\rm E}(3)$-Equivariant Actor-Critic Methods for Cooperative Multi-Agent Reinforcement Learning [7.712824077083934]
We focus on exploiting Euclidean symmetries inherent in certain cooperative multi-agent reinforcement learning problems.
We design neural network architectures with symmetric constraints embedded as an inductive bias for multi-agent actor-critic methods.
arXiv Detail & Related papers (2023-08-23T00:18:17Z)
- FAENet: Frame Averaging Equivariant GNN for Materials Modeling [123.19473575281357]
We introduce a flexible framework relying on stochastic frame averaging (SFA) to make any model E(3)-equivariant or invariant through data transformations.
We prove the validity of our method theoretically and empirically demonstrate its superior accuracy and computational scalability in materials modeling.
arXiv Detail & Related papers (2023-04-28T21:48:31Z)
- Reflection Equivariant Quantum Neural Networks for Enhanced Image Classification [0.7232471205719458]
We build new machine learning models that explicitly respect the symmetries inherent in their data, an approach known as geometric quantum machine learning (GQML).
We find that these networks are capable of consistently and significantly outperforming generic ansatze on complicated real-world image datasets.
arXiv Detail & Related papers (2022-12-01T04:10:26Z)
- Exploiting symmetry in variational quantum machine learning [0.5541644538483947]
Variational quantum machine learning is an extensively studied application of near-term quantum computers.
We show how a standard gateset can be transformed into an equivariant gateset that respects the symmetries of the problem at hand.
We benchmark the proposed methods on two toy problems that feature a non-trivial symmetry and observe a substantial increase in generalization performance.
arXiv Detail & Related papers (2022-05-12T17:01:41Z)
- Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study the learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs).
We first analyze the generalization ability of QCBMs and identify their superiorities when the quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z)
- Group-Invariant Quantum Machine Learning [0.0]
Quantum Machine Learning (QML) models are aimed at learning from data encoded in quantum states.
Group-invariant models produce outputs that remain invariant under the action of any element of the symmetry group $\mathfrak{G}$ associated to the dataset.
We present theoretical results underpinning the design of $\mathfrak{G}$-invariant models, and exemplify their application through several paradigmatic QML classification tasks.
arXiv Detail & Related papers (2022-05-04T18:04:32Z)
- Symmetry Group Equivariant Architectures for Physics [52.784926970374556]
In the domain of machine learning, an awareness of symmetries has driven impressive performance breakthroughs.
We argue that both the physics community and the broader machine learning community have much to gain from each other.
arXiv Detail & Related papers (2022-03-11T18:27:04Z)
- Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z)
- Tensor network models of AdS/qCFT [69.6561021616688]
We introduce the notion of a quasiperiodic conformal field theory (qCFT).
We show that qCFT can be best understood as belonging to a paradigm of discrete holography.
arXiv Detail & Related papers (2020-04-08T18:00:05Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.