Geometric quantum machine learning of BQP$^A$ protocols and latent graph classifiers
- URL: http://arxiv.org/abs/2402.03871v1
- Date: Tue, 6 Feb 2024 10:32:39 GMT
- Title: Geometric quantum machine learning of BQP$^A$ protocols and latent graph classifiers
- Authors: Chukwudubem Umeano, Vincent E. Elfving, Oleksandr Kyriienko
- Abstract summary: Geometric quantum machine learning (GQML) aims to embed problem symmetries for learning efficient solving protocols.
In this Letter we consider Simon's problem for learning properties of Boolean functions, and show that this can be related to an unsupervised circuit classification problem.
- Score: 17.857341127079305
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Geometric quantum machine learning (GQML) aims to embed problem symmetries
for learning efficient solving protocols. However, the question remains if
(G)QML can be routinely used for constructing protocols with an exponential
separation from classical analogs. In this Letter we consider Simon's problem
for learning properties of Boolean functions, and show that this can be related
to an unsupervised circuit classification problem. Using the workflow of
geometric QML, we learn from first principles Simon's algorithm, thus
discovering an example of BQP$^A\neq$BPP protocol with respect to some dataset
(oracle $A$). Our key findings include the development of an equivariant
feature map for embedding Boolean functions, based on twirling with respect to
identified bitflip and permutational symmetries, and measurement based on
invariant observables with a sampling advantage. The proposed workflow points
to the importance of data embeddings and classical post-processing, while
keeping the variational circuit as a trivial identity operator. Next,
developing the intuition for the function learning, we visualize instances as
directed computational hypergraphs, and observe that the GQML protocol can
access their global topological features for distinguishing bijective and
surjective functions. Finally, we discuss the prospects for learning other
BQP$^A$-type protocols, and conjecture that this depends on the ability of
simplifying embeddings-based oracles $A$ applied as a linear combination of
unitaries.
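The core claim of the abstract is that the learned protocol reproduces Simon's algorithm: each quantum query to the oracle $A$ returns a bitstring $y$ orthogonal (mod 2) to the hidden period $s$, and classical post-processing recovers $s$ from roughly $n$ samples. A minimal Python sketch of this query structure, assuming a toy two-to-one function built from a hidden string and a direct probability computation rather than the paper's GQML pipeline or equivariant embedding:

```python
import numpy as np

# Minimal sketch of the Simon's-problem query structure referenced above.
# Assumptions (not from the paper): a toy two-to-one function f built from a
# hidden string s, and direct statevector-style probability computation instead
# of the GQML pipeline with equivariant embeddings.

def simons_oracle(s_int):
    """f(x) = min(x, x ^ s), so f(x) = f(x ^ s): a two-to-one (non-bijective) function."""
    return lambda x: min(x, x ^ s_int)

def sample_y(s_int, n, shots=20, seed=0):
    """Sample the input register after H -> oracle -> H, as in Simon's circuit."""
    rng = np.random.default_rng(seed)
    f = simons_oracle(s_int)
    N = 2 ** n
    probs = np.zeros(N)
    for y in range(N):
        # Amplitude of |y>|z> is (1/N) * sum over {x : f(x) = z} of (-1)^(x . y).
        amp = {}
        for x in range(N):
            phase = (-1) ** bin(x & y).count("1")
            amp[f(x)] = amp.get(f(x), 0.0) + phase / N
        probs[y] = sum(a * a for a in amp.values())
    return rng.choice(N, size=shots, p=probs / probs.sum())

n, s_int = 3, 0b101          # hidden period s = 101 (toy instance)
ys = sample_y(s_int, n)
# Every sampled y satisfies y . s = 0 (mod 2); Gaussian elimination over GF(2)
# on ~n independent samples recovers s, which a classical (BPP) learner cannot
# do with polynomially many queries to the same oracle.
assert all(bin(y & s_int).count("1") % 2 == 0 for y in ys)
print(sorted(set(int(y) for y in ys)))
```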
Related papers
- Nearly query-optimal classical shadow estimation of unitary channels [6.715668514390893]
Classical shadow estimation is a powerful tool for learning properties of quantum states and quantum processes.
The goal is to learn a classical description of an unknown unitary channel $\mathcal{U}$ by querying it in quantum experiments.
Our protocol can also be applied to simultaneously predict many non-linear properties such as out-of-time-ordered correlators.
arXiv Detail & Related papers (2024-10-18T15:25:40Z)
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
- Can Geometric Quantum Machine Learning Lead to Advantage in Barcode Classification? [16.34646723046073]
We develop a geometric quantum machine learning (GQML) approach with embedded symmetries.
We show that quantum networks largely outperform their classical counterparts.
While the ability to achieve advantage largely depends on how data are loaded, we discuss how similar problems can benefit from quantum machine learning.
arXiv Detail & Related papers (2024-09-02T23:34:52Z)
- Block encoding by signal processing [0.0]
We demonstrate that quantum signal processing (QSP) based techniques, such as the Quantum Singular Value Transformation (QSVT) and the Quantum Eigenvalue Transformation for Unitary Matrices (QETU), can themselves be efficiently utilized for block-encoding (BE) implementation.
We present several examples of using QSVT and QETU algorithms, along with their combinations, to block encode Hamiltonians for lattice bosons.
We find that, while using QSVT for BE results in the best gate-count scaling with the number of qubits per site, LOVE-LCU outperforms all other methods for operators acting on $\lesssim 11$ qubits.
arXiv Detail & Related papers (2024-08-29T18:00:02Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined canonicalization functions.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
arXiv Detail & Related papers (2022-11-11T21:58:15Z)
- Group-Invariant Quantum Machine Learning [0.0]
Quantum Machine Learning (QML) models are aimed at learning from data encoded in quantum states.
Group-invariant models produce outputs that remain invariant under the action of any element of the symmetry group $\mathfrak{G}$ associated to the dataset.
We present theoretical results underpinning the design of $\mathfrak{G}$-invariant models, and exemplify their application through several paradigmatic QML classification tasks (a minimal twirling sketch is given after this list).
arXiv Detail & Related papers (2022-05-04T18:04:32Z)
- Adaptive neighborhood Metric learning [184.95321334661898]
We propose a novel distance metric learning algorithm, named adaptive neighborhood metric learning (ANML).
ANML can be used to learn both linear and deep embeddings.
The log-exp mean function proposed in our method gives a new perspective from which to review deep metric learning methods.
arXiv Detail & Related papers (2022-01-20T17:26:37Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- A Functional Perspective on Learning Symmetric Functions with Neural Networks [48.80300074254758]
We study the learning and representation of neural networks defined on measures.
We establish approximation and generalization bounds under different choices of regularization.
The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes.
arXiv Detail & Related papers (2020-08-16T16:34:33Z)
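The twirling construction that recurs above (the bitflip and permutational symmetries in the main abstract, and the $\mathfrak{G}$-invariant models of the Group-Invariant Quantum Machine Learning entry) can be illustrated numerically. A minimal sketch, assuming three qubits, the group generated by qubit permutations and a global bit flip, and a toy $ZZ$ base observable; it is not code from any of the listed papers:

```python
import numpy as np
from itertools import permutations

# Minimal sketch of observable twirling. Assumptions, not taken from any listed
# paper: 3 qubits, the group generated by qubit permutations and a global bit
# flip, and a toy Z(x)Z base observable.

n = 3
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron_all(ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

def permutation_unitary(perm, n):
    """Basis permutation sending the bit of qubit i to position perm[i]."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for x in range(dim):
        bits = [(x >> (n - 1 - i)) & 1 for i in range(n)]
        new_bits = [0] * n
        for i, p in enumerate(perm):
            new_bits[p] = bits[i]
        y = int("".join(map(str, new_bits)), 2)
        U[y, x] = 1.0
    return U

# Group elements: all qubit permutations, with and without the global bit flip.
flip = kron_all([X] * n)
group = []
for perm in permutations(range(n)):
    P = permutation_unitary(perm, n)
    group += [P, flip @ P]

O = kron_all([Z, Z, I2])                                  # base observable Z0 Z1
O_twirled = sum(U @ O @ U.T for U in group) / len(group)  # average over the group

# The twirled observable is G-invariant: it commutes with every group element,
# so its expectation values are unchanged by the symmetry action on the data.
assert all(np.allclose(U @ O_twirled, O_twirled @ U) for U in group)
print(np.round(np.diag(O_twirled), 3))  # symmetrized (Z0Z1 + Z0Z2 + Z1Z2)/3
```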