A Sierpinski Triangle Fermion-to-Qubit Transform
- URL: http://arxiv.org/abs/2409.04348v1
- Date: Fri, 6 Sep 2024 15:29:09 GMT
- Title: A Sierpinski Triangle Fermion-to-Qubit Transform
- Authors: Brent Harrison, Mitchell Chiew, Jason Necaise, Andrew Projansky, Sergii Strelchuk, James D. Whitfield
- Abstract summary: We present a novel fermion-to-qubit encoding based on the recently discovered "Sierpinski tree" data structure.
This encoding has the additional benefit of encoding the fermionic states as computational basis states.
- Score: 0.876484595372461
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In order to simulate a system of fermions on a quantum computer, it is necessary to represent the fermionic states and operators on qubits. This can be accomplished in multiple ways, including the well-known Jordan-Wigner transform, as well as the parity, Bravyi-Kitaev, and ternary tree encodings. Notably, the Bravyi-Kitaev encoding can be described in terms of a classical data structure, the Fenwick tree. Here we establish a correspondence between a class of classical data structures similar to the Fenwick tree, and a class of one-to-one fermion-to-qubit transforms. We present a novel fermion-to-qubit encoding based on the recently discovered "Sierpinski tree" data structure, which matches the operator locality of the ternary tree encoding, and has the additional benefit of encoding the fermionic states as computational basis states. This is analogous to the formulation of the Bravyi-Kitaev encoding in terms of the Fenwick tree.
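As a rough illustration of the operator-locality point in the abstract (this sketch is not from the paper; the function name and string convention are assumptions for illustration), the Jordan-Wigner transform maps the creation operator on fermionic mode $j$ to a Pauli string with a $Z$ parity chain on all lower-indexed modes, so operator weight grows linearly in $j$; tree-based encodings such as Bravyi-Kitaev reduce this to $O(\log n)$.

```python
def jordan_wigner_label(j: int, n_modes: int) -> str:
    """Pauli-string skeleton for the creation operator on mode j under
    Jordan-Wigner: a Z parity string on modes 0..j-1, the raising part
    '+' (shorthand for (X - iY)/2) on mode j, identity above."""
    ops = ["Z"] * j + ["+"] + ["I"] * (n_modes - j - 1)
    return "".join(ops)

# Operator weight (non-identity factors) is j + 1, i.e. linear in the
# mode index -- the locality cost that Fenwick-tree and Sierpinski-tree
# encodings improve on.
print(jordan_wigner_label(3, 6))  # "ZZZ+II"
```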
Related papers
- Dynamically generated concatenated codes and their phase diagrams [0.0]
We formulate code concatenation as the action of a unitary quantum circuit on an expanding tree geometry.
In the presence of bulk errors, the coding phase is a type of spin glass, characterized by a distribution of failure probabilities.
arXiv Detail & Related papers (2024-09-20T17:51:50Z) - Interpretable Spectral Variational AutoEncoder (ISVAE) for time series clustering [48.0650332513417]
We introduce a novel model that incorporates an interpretable bottleneck, termed the Filter Bank (FB), at the outset of a Variational Autoencoder (VAE).
This arrangement compels the VAE to attend to the most informative segments of the input signal.
By deliberately constraining the VAE with this FB, we promote the development of an encoding that is discernible, separable, and of reduced dimensionality.
arXiv Detail & Related papers (2023-10-18T13:06:05Z) - Spherical and Hyperbolic Toric Topology-Based Codes On Graph Embedding for Ising MRF Models: Classical and Quantum Topology Machine Learning [0.11805137592431453]
The paper introduces the application of information geometry to describe the ground states of Ising models.
The approach establishes a connection between machine learning and error-correcting coding.
arXiv Detail & Related papers (2023-07-28T19:38:13Z) - Homological Quantum Rotor Codes: Logical Qubits from Torsion [51.9157257936691]
Homological quantum rotor codes allow one to encode both logical rotors and logical qudits in the same block of code.
We show that the $0$-$\pi$ qubit as well as Kitaev's current-mirror qubit are indeed small examples of such codes.
arXiv Detail & Related papers (2023-03-24T00:29:15Z) - Characterizing Intrinsic Compositionality in Transformers with Tree Projections [72.45375959893218]
Neural models like transformers can route information arbitrarily between different parts of their input.
We show that transformers for three different tasks become more treelike over the course of training.
These trees are predictive of model behavior, with more tree-like models generalizing better on tests of compositional generalization.
arXiv Detail & Related papers (2022-11-02T17:10:07Z) - Transformer with Tree-order Encoding for Neural Program Generation [8.173517923612426]
We introduce a tree-based positional encoding and a shared natural-language subword vocabulary for Transformers.
Our findings suggest that employing a tree-based positional encoding in combination with a shared natural-language subword vocabulary improves generation performance over sequential positional encodings.
arXiv Detail & Related papers (2022-05-30T12:27:48Z) - Incorporating Constituent Syntax for Coreference Resolution [50.71868417008133]
We propose a graph-based method to incorporate constituent syntactic structures.
We also explore utilising higher-order neighbourhood information to encode rich structures in constituent trees.
Experiments on the English and Chinese portions of OntoNotes 5.0 benchmark show that our proposed model either beats a strong baseline or achieves new state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T07:40:42Z) - Dense Coding with Locality Restriction for Decoder: Quantum Encoders vs. Super-Quantum Encoders [67.12391801199688]
We investigate dense coding by imposing various locality restrictions to our decoder.
In this task, the sender Alice and the receiver Bob share an entangled state.
arXiv Detail & Related papers (2021-09-26T07:29:54Z) - Custom fermionic codes for quantum simulation [0.0]
We show that locality may in fact be too strict a condition, and that the size of operators can be reduced by encoding the system quasi-locally.
We give examples relevant to lattice models of condensed matter and to quantum-gravity systems such as SYK models.
arXiv Detail & Related papers (2020-09-24T17:59:14Z) - Mitigating Errors in Local Fermionic Encodings [6.0409040218619685]
We show that fermionic encodings with low-weight representations of local fermionic operators can still exhibit error-mitigating properties, in particular when undetectable errors correspond to "natural" fermionic noise.
This suggests that even when employing low-weight fermionic encodings, error rates can be suppressed in a similar fashion to high distance codes.
arXiv Detail & Related papers (2020-03-16T11:31:43Z) - Tree-structured Attention with Hierarchical Accumulation [103.47584968330325]
"Hierarchical Accumulation" encodes parse tree structures into self-attention at constant time complexity.
Our approach outperforms SOTA methods in four IWSLT translation tasks and the WMT'14 English-German translation task.
arXiv Detail & Related papers (2020-02-19T08:17:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.