Structured Unitary Tensor Network Representations for Circuit-Efficient Quantum Data Encoding
- URL: http://arxiv.org/abs/2602.16266v1
- Date: Wed, 18 Feb 2026 08:36:07 GMT
- Title: Structured Unitary Tensor Network Representations for Circuit-Efficient Quantum Data Encoding
- Authors: Guang Lin, Toshihisa Tanaka, Qibin Zhao
- Abstract summary: TNQE is a circuit-efficient quantum data encoding framework built on structured unitary tensor network representations. TNQE compiles the resulting tensor cores into an encoding circuit through two complementary core-to-circuit strategies. Across a range of benchmarks, TNQE achieves encoding circuits as shallow as $0.04\times$ the depth of amplitude encoding.
- Score: 33.951713386684425
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Encoding classical data into quantum states is a central bottleneck in quantum machine learning: many widely used encodings are circuit-inefficient, requiring deep circuits and substantial quantum resources, which limits scalability on quantum hardware. In this work, we propose TNQE, a circuit-efficient quantum data encoding framework built on structured unitary tensor network (TN) representations. TNQE first represents each classical input via a TN decomposition and then compiles the resulting tensor cores into an encoding circuit through two complementary core-to-circuit strategies. To make this compilation trainable while respecting the unitary nature of quantum operations, we introduce a unitary-aware constraint that parameterizes TN cores as learnable block unitaries, enabling them to be directly optimized and directly encoded as quantum operators. The proposed TNQE framework enables explicit control over circuit depth and qubit resources, allowing the construction of shallow, resource-efficient circuits. Across a range of benchmarks, TNQE achieves encoding circuits as shallow as $0.04\times$ the depth of amplitude encoding, while naturally scaling to high-resolution images ($256 \times 256$) and demonstrating practical feasibility on real quantum hardware.
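The abstract's first step, representing each classical input via a tensor network decomposition, can be illustrated with a minimal matrix product state (MPS) sketch: a length-$2^n$ vector is split into $n$ small cores by sequential SVDs, with a bond-dimension cap controlling the size (and hence circuit cost) of each core. The function names (`mps_cores`, `reconstruct`) and the bond-dimension parameter are illustrative assumptions, not the paper's actual TNQE implementation, which further constrains the cores to be block unitaries.

```python
import numpy as np

def mps_cores(x, n_sites, bond_dim):
    """Decompose a length-2**n_sites vector into MPS cores via sequential SVDs."""
    x = x / np.linalg.norm(x)            # quantum states are normalized
    cores, mat = [], x.reshape(1, -1)
    for _ in range(n_sites - 1):
        r = mat.shape[0]
        mat = mat.reshape(r * 2, -1)     # split off one physical (qubit) index
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        chi = min(bond_dim, len(s))      # truncate to the target bond dimension
        cores.append(u[:, :chi].reshape(r, 2, chi))
        mat = s[:chi, None] * vt[:chi]   # carry the remainder to the next site
    cores.append(mat.reshape(mat.shape[0], 2, 1))
    return cores

def reconstruct(cores):
    """Contract the cores back into a dense vector (for checking fidelity)."""
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=([-1], [0]))
    return out.reshape(-1)

rng = np.random.default_rng(0)
x = rng.normal(size=16)                          # a 4-qubit "classical input"
cores = mps_cores(x, n_sites=4, bond_dim=4)      # exact at full bond dimension
print(np.allclose(reconstruct(cores), x / np.linalg.norm(x)))  # True
```

Lowering `bond_dim` trades reconstruction fidelity for smaller cores; in the paper's framing, smaller (unitary-constrained) cores translate into shallower encoding circuits than full amplitude encoding.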
Related papers
- Domain-Aware Quantum Circuit for QML [0.7999703756441755]
We present a Domain-Aware Quantum Circuit (DAQC) that leverages image priors to guide locality-preserving encoding and entanglement. The design employs interleaved encode--train cycles, where entanglement is applied among qubits hosting neighboring pixels, aligned to device connectivity. We evaluate DAQC on the MNIST, FashionMNIST, and PneumoniaMNIST datasets.
arXiv Detail & Related papers (2025-12-19T17:02:58Z) - Fidelity-Preserving Quantum Encoding for Quantum Neural Networks [2.9621136443259872]
Existing encoding schemes discard spatial and semantic information when adapting high-dimensional images to the limited qubits of Noisy Intermediate-Scale Quantum (NISQ) devices. We propose a Fidelity-Preserving Quantum Encoding (FPQE) framework that performs near-lossless data compression and quantum encoding. Experimental results show that FPQE performs comparably to conventional methods on simple datasets such as MNIST, while achieving clear improvements on more complex ones.
arXiv Detail & Related papers (2025-11-19T11:44:39Z) - Quantum-Efficient Convolution through Sparse Matrix Encoding and Low-Depth Inner Product Circuits [0.0]
We present a resource-efficient quantum algorithm that reformulates the convolution product as a structured matrix multiplication. We construct a quantum framework wherein sparse input patches are prepared using optimized key-value QRAM state encoding. Our architecture supports batched convolution across multiple filters using a generalized SWAP circuit.
arXiv Detail & Related papers (2025-07-25T20:08:12Z) - A Qubit-Efficient Hybrid Quantum Encoding Mechanism for Quantum Machine Learning [11.861417859173859]
Quantum Principal Geodesic Analysis (qPGA) is a non-invertible method for dimensionality reduction and qubit-efficient encoding. We show that qPGA preserves local structure more effectively than both quantum and hybrid autoencoders. In downstream QML classification tasks, qPGA can achieve over 99% accuracy and F1-score on MNIST and Fashion-MNIST, outperforming quantum-dependent baselines.
arXiv Detail & Related papers (2025-06-24T03:09:16Z) - HQViT: Hybrid Quantum Vision Transformer for Image Classification [48.72766405978677]
We propose a Hybrid Quantum Vision Transformer (HQViT) to accelerate model training while enhancing model performance. HQViT introduces whole-image processing with amplitude encoding to better preserve global image information without additional positional encoding. Experiments across various computer vision datasets demonstrate that HQViT outperforms existing models, achieving a maximum improvement of up to $10.9\%$ (on the MNIST 10-classification task) over the state of the art.
arXiv Detail & Related papers (2025-04-03T16:13:34Z) - Integrated Encoding and Quantization to Enhance Quanvolutional Neural Networks [2.789685107745028]
We propose two ways to enhance the efficiency of quanvolutional models.
First, we propose a flexible data quantization approach with memoization, applicable to any encoding method.
Second, we introduce a new integrated encoding strategy, which combines the encoding and processing steps in a single circuit.
arXiv Detail & Related papers (2024-10-08T07:57:13Z) - Quantum Compiling with Reinforcement Learning on a Superconducting Processor [55.135709564322624]
We develop a reinforcement learning-based quantum compiler for a superconducting processor.
We demonstrate its capability of discovering novel and hardware-amenable circuits with short lengths.
Our study exemplifies the codesign of the software with hardware for efficient quantum compilation.
arXiv Detail & Related papers (2024-06-18T01:49:48Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Deep Quantum Error Correction [73.54643419792453]
Quantum error correction codes (QECC) are a key component for realizing the potential of quantum computing.
In this work, we efficiently train novel end-to-end deep quantum error decoders.
The proposed method demonstrates the power of neural decoders for QECC by achieving state-of-the-art accuracy.
arXiv Detail & Related papers (2023-01-27T08:16:26Z) - Decomposition of Matrix Product States into Shallow Quantum Circuits [62.5210028594015]
Tensor network (TN) algorithms can be mapped to parametrized quantum circuits (PQCs).
We propose a new protocol for approximating TN states using realistic quantum circuits.
Our results reveal one particular protocol, involving sequential growth and optimization of the quantum circuit, to outperform all other methods.
arXiv Detail & Related papers (2022-09-01T17:08:41Z) - Quantum circuit debugging and sensitivity analysis via local inversions [62.997667081978825]
We present a technique that pinpoints the sections of a quantum circuit that affect the circuit output the most.
We demonstrate the practicality and efficacy of the proposed technique by applying it to example algorithmic circuits implemented on IBM quantum machines.
arXiv Detail & Related papers (2022-04-12T19:39:31Z) - QTN-VQC: An End-to-End Learning framework for Quantum Neural Networks [71.14713348443465]
We introduce a trainable quantum tensor network (QTN) for quantum embedding on a variational quantum circuit (VQC).
QTN enables an end-to-end parametric model pipeline, namely QTN-VQC, from the generation of quantum embedding to the output measurement.
Our experiments on the MNIST dataset demonstrate the advantages of QTN for quantum embedding over other quantum embedding approaches.
arXiv Detail & Related papers (2021-10-06T14:44:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.