Group-Invariant Quantum Machine Learning
- URL: http://arxiv.org/abs/2205.02261v1
- Date: Wed, 4 May 2022 18:04:32 GMT
- Title: Group-Invariant Quantum Machine Learning
- Authors: Martin Larocca, Frederic Sauvage, Faris M. Sbahi, Guillaume Verdon,
Patrick J. Coles, M. Cerezo
- Abstract summary: Quantum Machine Learning (QML) models are aimed at learning from data encoded in quantum states.
Group-invariant models produce outputs that remain invariant under the action of any element of the symmetry group $\mathfrak{G}$ associated to the dataset.
We present theoretical results underpinning the design of $\mathfrak{G}$-invariant models, and exemplify their application through several paradigmatic QML classification tasks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum Machine Learning (QML) models are aimed at learning from data encoded
in quantum states. Recently, it has been shown that models with little to no
inductive biases (i.e., with no assumptions about the problem embedded in the
model) are likely to have trainability and generalization issues, especially
for large problem sizes. As such, it is fundamental to develop schemes that
encode as much information as available about the problem at hand. In this work
we present a simple, yet powerful, framework where the underlying invariances
in the data are used to build QML models that, by construction, respect those
symmetries. These so-called group-invariant models produce outputs that remain
invariant under the action of any element of the symmetry group $\mathfrak{G}$
associated to the dataset. We present theoretical results underpinning the
design of $\mathfrak{G}$-invariant models, and exemplify their application
through several paradigmatic QML classification tasks including cases when
$\mathfrak{G}$ is a continuous Lie group and also when it is a discrete
symmetry group. Notably, our framework allows us to recover, in an elegant way,
several well-known algorithms from the literature, as well as to discover new
ones. Taken together, we expect that our results will help pave the way towards
a more geometric and group-theoretic approach to QML model design.
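The core construction, producing outputs invariant under every element of $\mathfrak{G}$, can be illustrated classically by "twirling": averaging an arbitrary function over the group action. Below is a minimal NumPy sketch under assumed choices, a cyclic-shift symmetry group and a hypothetical scalar model `f`, neither of which is taken from the paper:

```python
import numpy as np

# Hypothetical "model": any scalar function of an input vector (illustrative only).
def f(x):
    return np.tanh(x @ np.arange(1.0, len(x) + 1)).item()

def twirl(f, group):
    """Symmetrize f by averaging over all group actions (the 'twirl')."""
    return lambda x: sum(f(g(x)) for g in group) / len(group)

n = 5
# G = Z_n acting on the input register by cyclic shifts.
group = [lambda x, k=k: np.roll(x, k) for k in range(n)]
f_inv = twirl(f, group)

x = np.random.default_rng(0).normal(size=n)
# By construction f_inv is invariant: shifting the input leaves the output unchanged.
print(abs(f_inv(np.roll(x, 2)) - f_inv(x)) < 1e-12)  # prints True
```

The same averaging idea underlies the quantum setting, where the group acts on encoded states rather than classical vectors; this sketch only conveys the invariance-by-construction principle.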
Related papers
- Promises and Pitfalls of Generative Masked Language Modeling: Theoretical Framework and Practical Guidelines [74.42485647685272]
We focus on Generative Masked Language Models (GMLMs)
We train a model to fit conditional probabilities of the data distribution via masking, which are subsequently used as inputs to a Markov Chain to draw samples from the model.
We adapt the T5 model for iteratively-refined parallel decoding, achieving 2-3x speedup in machine translation with minimal sacrifice in quality.
arXiv Detail & Related papers (2024-07-22T18:00:00Z)
- Provably Trainable Rotationally Equivariant Quantum Machine Learning [0.6435156676256051]
We introduce a family of rotationally equivariant QML models built upon the quantum Fourier transform.
We numerically test our models on a dataset of simulated scanning tunnelling microscope images of phosphorus impurities in silicon.
arXiv Detail & Related papers (2023-11-10T05:10:06Z)
- Approximately Equivariant Quantum Neural Network for $p4m$ Group Symmetries in Images [30.01160824817612]
This work proposes equivariant Quantum Convolutional Neural Networks (EquivQCNNs) for image classification under planar $p4m$ symmetry.
We present the results tested in different use cases, such as phase detection of the 2D Ising model and classification of the extended MNIST dataset.
arXiv Detail & Related papers (2023-10-03T18:01:02Z)
- FAENet: Frame Averaging Equivariant GNN for Materials Modeling [123.19473575281357]
We introduce a flexible framework relying on stochastic frame averaging (SFA) to make any model E(3)-equivariant or invariant through data transformations.
We prove the validity of our method theoretically and empirically demonstrate its superior accuracy and computational scalability in materials modeling.
arXiv Detail & Related papers (2023-04-28T21:48:31Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Reflection Equivariant Quantum Neural Networks for Enhanced Image Classification [0.7232471205719458]
We build machine learning models that explicitly respect the symmetries inherent in their data, an approach known as geometric quantum machine learning (GQML).
We find that these networks are capable of consistently and significantly outperforming generic ansatze on complicated real-world image datasets.
arXiv Detail & Related papers (2022-12-01T04:10:26Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Theoretical Guarantees for Permutation-Equivariant Quantum Neural Networks [0.0]
We show how to build equivariant quantum neural networks (QNNs).
We prove that they do not suffer from barren plateaus, quickly reach overparametrization, and generalize well from small amounts of data.
Our work provides the first theoretical guarantees for equivariant QNNs, thus indicating the extreme power and potential of GQML.
arXiv Detail & Related papers (2022-10-18T16:35:44Z)
- Representation Theory for Geometric Quantum Machine Learning [0.0]
Recent advances in classical machine learning have shown that creating models with inductive biases encoding the symmetries of a problem can greatly improve performance.
Geometric Quantum Machine Learning (GQML) will play a crucial role in developing problem-specific and quantum-aware models.
We present an introduction to representation theory tools through the lens of quantum learning, driven by key examples involving discrete and continuous groups.
arXiv Detail & Related papers (2022-10-14T17:25:36Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Learning Gaussian Graphical Models via Multiplicative Weights [54.252053139374205]
We adapt an algorithm of Klivans and Meka based on the method of multiplicative weight updates.
The algorithm enjoys a sample complexity bound that is qualitatively similar to others in the literature.
It has a low runtime $O(mp^2)$ in the case of $m$ samples and $p$ nodes, and can trivially be implemented in an online manner.
arXiv Detail & Related papers (2020-02-20T10:50:58Z)
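The method of multiplicative weight updates referenced above can be illustrated on the classic prediction-with-expert-advice problem. This is a generic sketch of the technique, not the Klivans-Meka graphical-model algorithm itself; the loss model, learning rate, and all parameters below are illustrative assumptions:

```python
import numpy as np

# Multiplicative weight updates ("Hedge") on prediction with expert advice.
rng = np.random.default_rng(1)
n_experts, T, eta = 4, 200, 0.1
weights = np.ones(n_experts)
total_loss = np.zeros(n_experts)
learner_loss = 0.0

for t in range(T):
    p = weights / weights.sum()           # play the weighted mixture of experts
    losses = rng.uniform(size=n_experts)  # per-round losses in [0, 1]
    losses[0] *= 0.1                      # expert 0 is consistently better
    learner_loss += p @ losses
    total_loss += losses
    weights *= np.exp(-eta * losses)      # multiplicative update

# Regret vs. the best expert is sublinear: O(sqrt(T log n)) for tuned eta.
regret = learner_loss - total_loss.min()
print(regret < 2 * np.sqrt(T * np.log(n_experts)))
```

The online character noted in the summary is visible here: each round uses only the current losses, so the update runs in a streaming fashion with no need to revisit past samples.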
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.