Reverse Map Projections as Equivariant Quantum Embeddings
- URL: http://arxiv.org/abs/2407.19906v2
- Date: Mon, 19 Aug 2024 09:40:32 GMT
- Title: Reverse Map Projections as Equivariant Quantum Embeddings
- Authors: Max Arnott, Dimitri Papaioannou, Kieran McDowall, Phalgun Lolur, Bambordé Baldé
- Abstract summary: We introduce the novel class $(E_\alpha)_{\alpha \in [-\infty,1)}$ of reverse map projection embeddings.
Inspired by well-known map projections from the unit sphere onto its tangent planes, these embeddings address the common drawback of the amplitude embedding method: scalar multiples of a data point are identified, so information about the norm of the data is lost.
We show how reverse map projections can be utilised as equivariant embeddings for quantum machine learning.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce the novel class $(E_\alpha)_{\alpha \in [-\infty,1)}$ of reverse map projection embeddings, each one defining a unique new method of encoding classical data into quantum states. Inspired by well-known map projections from the unit sphere onto its tangent planes, used in practice in cartography, these embeddings address the common drawback of the amplitude embedding method, wherein scalar multiples of data points are identified and information about the norm of data is lost. We show how reverse map projections can be utilised as equivariant embeddings for quantum machine learning. Using these methods, we can leverage symmetries in classical datasets to significantly strengthen performance on quantum machine learning tasks. Finally, we select four values of $\alpha$ with which to perform a simple classification task, taking $E_\alpha$ as the embedding and experimenting with both equivariant and non-equivariant setups. We compare their results alongside those of standard amplitude embedding.
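Since the abstract describes the construction only in prose, the following minimal sketch illustrates the drawback being addressed and the general "lift, then normalise" idea. The `inverse_stereographic_embedding` below is a hypothetical stand-in (the textbook inverse stereographic projection onto the unit sphere), not the paper's $E_\alpha$ family; it only shows how lifting data into one extra dimension before normalising retains the norm information that amplitude embedding discards, and how such a lift interacts with an orthogonal symmetry of the data.

```python
import numpy as np

def amplitude_embedding(x):
    """Standard amplitude embedding: x and c*x (c > 0) give the same
    state vector, so the norm of the data point is lost."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

def inverse_stereographic_embedding(x):
    """Hypothetical stand-in for a reverse map projection embedding:
    lift x in R^n onto the unit sphere in R^(n+1) via the inverse
    stereographic projection.  The result is a unit vector (a valid
    amplitude vector) that still depends on ||x||."""
    x = np.asarray(x, dtype=float)
    r2 = float(np.dot(x, x))
    return np.concatenate([2.0 * x, [r2 - 1.0]]) / (r2 + 1.0)

if __name__ == "__main__":
    x = np.array([0.3, 0.4])

    # Amplitude embedding identifies scalar multiples; the lifted embedding does not.
    print(amplitude_embedding(x), amplitude_embedding(2 * x))                          # identical
    print(inverse_stereographic_embedding(x), inverse_stereographic_embedding(2 * x))  # different

    # Equivariance under an orthogonal symmetry of the data: embedding a rotated
    # point equals applying the block-diagonal rotation R (+) 1 to the embedding.
    theta = 0.7
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    lifted_R = np.block([[R, np.zeros((2, 1))],
                         [np.zeros((1, 2)), np.ones((1, 1))]])
    lhs = inverse_stereographic_embedding(R @ x)
    rhs = lifted_R @ inverse_stereographic_embedding(x)
    print(np.allclose(lhs, rhs))  # True
```

The last check mirrors the kind of structure the paper exploits: a symmetry acting on the classical data lifts to a concrete transformation acting on the embedded state, which is what equivariant quantum machine learning models are built around.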
Related papers
- Disentangled Representation Learning with the Gromov-Monge Gap [65.73194652234848]
Learning disentangled representations from unlabelled data is a fundamental challenge in machine learning.
We introduce a novel approach to disentangled representation learning based on quadratic optimal transport.
We demonstrate the effectiveness of our approach for quantifying disentanglement across four standard benchmarks.
arXiv Detail & Related papers (2024-07-10T16:51:32Z) - Label Learning Method Based on Tensor Projection [82.51786483693206]
We propose a label learning method based on tensor projection (LLMTP).
We extend the matrix projection transformation to tensor projection, so that the spatial structure information between views can be fully utilized.
In addition, we introduce the tensor Schatten $p$-norm regularization to make the clustering label matrices of different views as consistent as possible.
arXiv Detail & Related papers (2024-02-26T13:03:26Z) - Gradient-Based Feature Learning under Structured Data [57.76552698981579]
In the anisotropic setting, the commonly used spherical gradient dynamics may fail to recover the true direction.
We show that appropriate weight normalization that is reminiscent of batch normalization can alleviate this issue.
In particular, under the spiked model with a suitably large spike, the sample complexity of gradient-based training can be made independent of the information exponent.
arXiv Detail & Related papers (2023-09-07T16:55:50Z) - Classification of data with a qudit, a geometric approach [0.0]
We propose a model for data classification using isolated quantum $d$-level systems or else qudits.
We show that this geometrically inspired qudit model for classification is able to solve nonlinear classification problems using a small number of parameters only and without requiring entangling operations.
arXiv Detail & Related papers (2023-07-26T09:13:43Z) - 4D Panoptic Segmentation as Invariant and Equivariant Field Prediction [48.57732508537554]
We develop rotation-equivariant neural networks for 4D panoptic segmentation.
We show that our models achieve higher accuracy with lower computational costs compared to their non-equivariant counterparts.
Our method sets a new state of the art and achieves 1st place on the SemanticKITTI 4D Panoptic leaderboard.
arXiv Detail & Related papers (2023-03-28T00:20:37Z) - Neural Jacobian Fields: Learning Intrinsic Mappings of Arbitrary Meshes [38.157373733083894]
This paper introduces a framework designed to accurately predict piecewise linear mappings of arbitrary meshes via a neural network.
The framework is based on reducing the neural aspect to a prediction of a matrix for a single point, conditioned on a global shape descriptor.
By operating in the intrinsic gradient domain of each individual mesh, it allows the framework to predict highly-accurate mappings.
arXiv Detail & Related papers (2022-05-05T19:51:13Z) - Parametric t-Stochastic Neighbor Embedding With Quantum Neural Network [0.6946929968559495]
t-Stochastic Neighbor Embedding (t-SNE) is a non-parametric data visualization method in classical machine learning.
We propose to use quantum neural networks for parametric t-SNE to reflect the characteristics of high-dimensional quantum data on low-dimensional data.
arXiv Detail & Related papers (2022-02-09T02:49:54Z) - Smoothed Embeddings for Certified Few-Shot Learning [63.68667303948808]
We extend randomized smoothing to few-shot learning models that map inputs to normalized embeddings.
Our results are confirmed by experiments on different datasets.
arXiv Detail & Related papers (2022-02-02T18:19:04Z) - Asymmetric compressive learning guarantees with applications to quantized sketches [15.814495790111323]
The compressive learning framework reduces the computational cost of training on large-scale datasets by first compressing the data into a lightweight sketch through a feature map.
We study the relaxation where this map is allowed to be different for each phase.
We then instantiate this framework to the setting of quantized sketches, by proving that the LPD indeed holds for binary sketch contributions.
arXiv Detail & Related papers (2021-04-20T15:37:59Z) - LOCA: LOcal Conformal Autoencoder for standardized data coordinates [6.608924227377152]
We present a method for learning an embedding in $\mathbb{R}^d$ that is isometric to the latent variables of the manifold.
Our embedding is obtained using a LOcal Conformal Autoencoder (LOCA), an algorithm that constructs an embedding to rectify deformations.
We also apply LOCA to single-site Wi-Fi localization data, and to $3$-dimensional curved surface estimation.
arXiv Detail & Related papers (2020-04-15T17:49:37Z) - Quaternion Equivariant Capsule Networks for 3D Point Clouds [58.566467950463306]
We present a 3D capsule module for processing point clouds that is equivariant to 3D rotations and translations.
We connect dynamic routing between capsules to the well-known Weiszfeld algorithm (a minimal sketch of the Weiszfeld iteration is given after this list).
Based on our operator, we build a capsule network that disentangles geometry from pose.
arXiv Detail & Related papers (2019-12-27T13:51:17Z)
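The capsule-network entry above appeals to the Weiszfeld algorithm for the geometric median. As a point of reference only, here is the textbook iteration (not the routing scheme of the cited paper): each step re-weights the points by the inverse of their distance to the current estimate and averages.

```python
import numpy as np

def weiszfeld(points, iters=100, tol=1e-9):
    """Textbook Weiszfeld iteration: estimate the geometric median of a
    point set by repeated inverse-distance re-weighting and averaging."""
    pts = np.asarray(points, dtype=float)
    y = pts.mean(axis=0)                                      # initialise at the centroid
    for _ in range(iters):
        d = np.maximum(np.linalg.norm(pts - y, axis=1), tol)  # guard against division by zero
        w = 1.0 / d
        y_new = (w[:, None] * pts).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            break
        y = y_new
    return y

# e.g. weiszfeld([[0, 0], [1, 0], [0, 1], [10, 10]]) is pulled far less
# toward the outlier than the plain mean would be.
```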
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.