Enforcing exact permutation and rotational symmetries in the application of quantum neural network on point cloud datasets
- URL: http://arxiv.org/abs/2405.11150v3
- Date: Mon, 17 Jun 2024 02:34:28 GMT
- Title: Enforcing exact permutation and rotational symmetries in the application of quantum neural network on point cloud datasets
- Authors: Zhelun Li, Lento Nagano, Koji Terashi
- Abstract summary: Recent developments in the field of quantum machine learning have promoted the idea of incorporating physical symmetries in the structure of quantum circuits.
We provide a novel structure of QNN that is exactly invariant to both rotations and permutations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent developments in the field of quantum machine learning have promoted the idea of incorporating physical symmetries in the structure of quantum circuits. A crucial milestone in this area is the realization of $S_{n}$-permutation equivariant quantum neural networks (QNN) that are equivariant under permutations of input objects. In this work, we focus on encoding the rotational symmetry of point cloud datasets into the QNN. The key insight of the approach is that all rotationally invariant functions with vector inputs are equivalent to a function with inputs of vector inner products. We provide a novel structure of QNN that is exactly invariant to both rotations and permutations, with its efficacy demonstrated numerically in the problems of two-dimensional image classifications and identifying high-energy particle decays, produced by proton-proton collisions, with the $SO(1,3)$ Lorentz symmetry.
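The key insight stated in the abstract can be checked directly: rotations preserve inner products between vectors, so any function built only from those inner products is automatically rotation invariant. A minimal classical sketch in plain NumPy (an illustration of the mathematical fact, not the paper's QNN):

```python
import numpy as np

rng = np.random.default_rng(0)

def gram_features(points):
    """Pairwise inner products (the Gram matrix) of the input vectors.

    Rotations preserve inner products, so any function of these
    features is automatically rotation invariant."""
    return points @ points.T

# A random 3D point cloud with five points.
points = rng.normal(size=(5, 3))

# A random orthogonal matrix via QR decomposition of a Gaussian matrix.
q, r = np.linalg.qr(rng.normal(size=(3, 3)))
q = q * np.sign(np.diag(r))   # fix column signs
if np.linalg.det(q) < 0:      # force det = +1: a rotation, not a reflection
    q[:, 0] *= -1

rotated = points @ q.T        # apply the rotation to every point

# The Gram matrix is unchanged by the rotation (up to floating-point error).
assert np.allclose(gram_features(points), gram_features(rotated))
```

The same identity underlies the paper's construction: encoding inner products rather than raw coordinates makes invariance exact by design rather than approximate by training.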
Related papers
- Image Classification with Rotation-Invariant Variational Quantum Circuits [0.0]
Variational quantum algorithms are gaining attention as an early application of Noisy Intermediate-Scale Quantum (NISQ) devices.
One of the main problems of variational methods is the barren-plateau phenomenon, which arises in the optimization of the variational parameters.
Adding inductive bias to the quantum models has been proposed as a potential solution to mitigate this problem, leading to a new field called Geometric Quantum Machine Learning.
arXiv Detail & Related papers (2024-03-22T08:26:31Z) - All you need is spin: SU(2) equivariant variational quantum circuits based on spin networks [0.0]
Variational algorithms require architectures that naturally constrain the optimisation space to run efficiently.
We propose the use of spin networks, a form of directed tensor network invariant under a group transformation, to devise SU(2) equivariant quantum circuit ansätze.
By changing to the basis that block-diagonalises the SU(2) group action, these networks provide a natural building block for constructing parameterised equivariant quantum circuits.
arXiv Detail & Related papers (2023-09-13T18:38:41Z) - Neural Functional Transformers [99.98750156515437]
This paper uses the attention mechanism to define a novel set of permutation equivariant weight-space layers called neural functional Transformers (NFTs).
NFTs respect weight-space permutation symmetries while incorporating the advantages of attention, which have exhibited remarkable success across multiple domains.
We also leverage NFTs to develop Inr2Array, a novel method for computing permutation invariant representations from the weights of implicit neural representations (INRs).
arXiv Detail & Related papers (2023-05-22T23:38:27Z) - Permutation Invariant Encodings for Quantum Machine Learning with Point Cloud Data [0.27342795342528275]
We show a permutation invariant quantum encoding method, which exhibits superior generalisation performance.
We show that a permutation invariant encoding improves in accuracy as the number of points contained in the point cloud increases.
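For intuition, the simplest classical analogue of a permutation-invariant encoding is sum-pooling of per-point features: a sum gives the same result for any ordering of the input. A hedged classical sketch with arbitrary illustrative features, not the quantum encoding described in the paper:

```python
import numpy as np

def invariant_encoding(points):
    """Sum-pool simple per-point features into an order-independent vector.

    A sum over points gives the same result for any ordering of the
    input, so the encoding is exactly permutation invariant."""
    per_point = np.stack(
        [points.sum(axis=1),          # coordinate sum of each point
         (points ** 2).sum(axis=1)],  # squared norm of each point
        axis=1)
    return per_point.sum(axis=0)      # pool over the point dimension

rng = np.random.default_rng(1)
cloud = rng.normal(size=(6, 3))
shuffled = cloud[rng.permutation(6)]

# Reordering the points leaves the encoding unchanged.
assert np.allclose(invariant_encoding(cloud), invariant_encoding(shuffled))
```

Any symmetric pooling operation (sum, mean, max) would serve the same role; the quantum versions cited here enforce the symmetry in the circuit structure instead.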
arXiv Detail & Related papers (2023-04-07T11:53:17Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - Towards Neural Variational Monte Carlo That Scales Linearly with System Size [67.09349921751341]
Quantum many-body problems are central to demystifying some exotic quantum phenomena, e.g., high-temperature superconductors.
The combination of neural networks (NN) for representing quantum states, and the Variational Monte Carlo (VMC) algorithm, has been shown to be a promising method for solving such problems.
We propose a NN architecture called Vector-Quantized Neural Quantum States (VQ-NQS) that utilizes vector-quantization techniques to leverage redundancies in the local-energy calculations of the VMC algorithm.
arXiv Detail & Related papers (2022-12-21T19:00:04Z) - Theory for Equivariant Quantum Neural Networks [0.0]
We present a theoretical framework to design equivariant quantum neural networks (EQNNs) for essentially any relevant symmetry group.
Our framework can be readily applied to virtually all areas of quantum machine learning.
arXiv Detail & Related papers (2022-10-16T15:42:21Z) - Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
A promising trend in machine learning (ML) builds on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes.
arXiv Detail & Related papers (2022-09-17T17:02:35Z) - Symmetric Pruning in Quantum Neural Networks [111.438286016951]
Quantum neural networks (QNNs) harness the power of modern quantum machines.
QNNs with handcrafted symmetric ansatzes generally exhibit better trainability than those with asymmetric ansatzes.
We propose the effective quantum neural tangent kernel (EQNTK) to quantify the convergence of QNNs towards the global optima.
arXiv Detail & Related papers (2022-08-30T08:17:55Z) - Variational Monte Carlo calculations of $\mathbf{A\leq 4}$ nuclei with an artificial neural-network correlator ansatz [62.997667081978825]
We introduce a neural-network quantum state ansatz to model the ground-state wave function of light nuclei.
We compute the binding energies and point-nucleon densities of $A\leq 4$ nuclei as emerging from a leading-order pionless effective field theory Hamiltonian.
arXiv Detail & Related papers (2020-07-28T14:52:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.