Learning Time-Varying Quantum Lossy Channels
- URL: http://arxiv.org/abs/2504.12810v1
- Date: Thu, 17 Apr 2025 10:15:47 GMT
- Title: Learning Time-Varying Quantum Lossy Channels
- Authors: Angela Rosy Morgillo, Stefano Mancini, Massimiliano F. Sacchi, Chiara Macchiavello
- Abstract summary: We employ neural networks to classify, regress, and forecast the behavior of time-varying quantum channels. The networks achieve at least 87% accuracy in distinguishing between non-Markovian, Markovian, memoryless, compound, and deterministic channels.
- Score: 2.499907423888049
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time-varying quantum channels are essential for modeling realistic quantum systems with evolving noise properties. Here, we consider Gaussian lossy channels varying from one use to another, and we employ neural networks to classify, regress, and forecast the behavior of these channels from their Choi-Jamiolkowski states. The networks achieve at least 87% accuracy in distinguishing between non-Markovian, Markovian, memoryless, compound, and deterministic channels. In regression tasks, the model accurately reconstructs the loss parameter sequences, and in forecasting, it predicts future values, with improved performance as the memory parameter approaches 1 for Markovian channels. These results demonstrate the potential of neural networks in characterizing and predicting the dynamics of quantum channels.
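The channel families named in the abstract can be illustrated with a toy generator of loss-parameter sequences. The AR(1)-style mixing used for the Markovian case, and the parameter names `mu` and `eta0`, are illustrative assumptions, not the paper's exact construction:

```python
import random

def loss_sequence(kind, n, mu=0.9, eta0=0.5, seed=0):
    """Toy loss-parameter sequences eta_t in [0, 1] for a few channel families.

    'deterministic': constant eta; 'memoryless': i.i.d. uniform draws;
    'markovian': AR(1)-style mixing with memory parameter mu (an assumed
    form for illustration only).
    """
    rng = random.Random(seed)
    etas, eta = [], eta0
    for _ in range(n):
        if kind == "deterministic":
            pass                                      # eta stays fixed
        elif kind == "memoryless":
            eta = rng.random()                        # fresh draw each use
        elif kind == "markovian":
            eta = mu * eta + (1 - mu) * rng.random()  # correlated across uses
        else:
            raise ValueError(kind)
        etas.append(eta)
    return etas
```

As `mu` approaches 1, the Markovian sequence barely moves between uses, which is consistent with the abstract's observation that forecasting improves in that regime.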
Related papers
- Hybrid Quantum Recurrent Neural Network For Remaining Useful Life Prediction [67.410870290301]
We introduce a Hybrid Quantum Recurrent Neural Network framework, combining Quantum Long Short-Term Memory layers with classical dense layers for Remaining Useful Life forecasting.
Experimental results demonstrate that, despite having fewer trainable parameters, the Hybrid Quantum Recurrent Neural Network achieves up to a 5% improvement over a Recurrent Neural Network.
arXiv Detail & Related papers (2025-04-29T14:41:41Z) - Dreaming Learning [41.94295877935867]
Introducing new information to a machine learning system can interfere with previously stored data. We propose a training algorithm inspired by Stuart Kauffman's notion of the Adjacent Possible. It predisposes the neural network to smoothly accept and integrate data sequences with statistical characteristics different from those expected.
arXiv Detail & Related papers (2024-10-23T09:17:31Z) - Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
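The core FNO ingredient is a spectral convolution that reweights a truncated set of Fourier modes. A minimal sketch follows; the single-channel 1-D setup and the shapes are simplifying assumptions, not this paper's architecture:

```python
import numpy as np

def fourier_layer(x, weights, modes):
    """One illustrative spectral-convolution step in the style of a Fourier
    Neural Operator: transform to frequency space, scale a truncated set
    of low modes by learned weights, transform back.
    """
    xf = np.fft.rfft(x)                  # real FFT of the 1-D signal
    out = np.zeros_like(xf)
    out[:modes] = weights * xf[:modes]   # keep and reweight only low modes
    return np.fft.irfft(out, n=len(x))   # back to physical space

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)  # 8 learned modes
y = fourier_layer(x, w, modes=8)
```

Truncating to low modes is what lets the operator act on a coarse, resolution-independent representation rather than the full state.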
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Detecting Markovianity of Quantum Processes via Recurrent Neural Networks [0.0]
We present a novel methodology utilizing Recurrent Neural Networks (RNNs) to classify Markovian and non-Markovian quantum processes. The model exhibits exceptional accuracy, surpassing 95%, across diverse scenarios.
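As a point of comparison for this classification task, a purely classical baseline (not the paper's RNN) can flag temporal memory via the lag-1 autocorrelation of a sequence; the threshold value below is an illustrative choice:

```python
import random

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation, a simple statistic for memory."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    if var == 0:
        return 0.0
    cov = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1)) / n
    return cov / var

def looks_markovian(xs, threshold=0.3):
    """Flag a sequence as carrying memory when its lag-1 autocorrelation
    is large; the threshold is an illustrative choice, not a tuned value."""
    return lag1_autocorr(xs) > threshold

rng = random.Random(1)
iid = [rng.random() for _ in range(500)]      # no memory
x, markov = 0.5, []
for _ in range(500):
    x = 0.9 * x + 0.1 * rng.random()          # strong memory
    markov.append(x)
```

A learned RNN classifier can of course capture far subtler signatures than a single autocorrelation statistic, which is the motivation for the neural approach.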
arXiv Detail & Related papers (2024-06-11T13:05:36Z) - Improving Continuous-variable Quantum Channels with Unitary Averaging [37.69303106863453]
We present a scheme of passive linear optical unitary averaging for protecting unknown Gaussian states transmitted through an optical channel.
The scheme reduces the effect of phase noise on purity, squeezing and entanglement, thereby enhancing the channel via a probabilistic error correcting protocol.
arXiv Detail & Related papers (2023-11-17T10:10:19Z) - Quantum State Reconstruction in a Noisy Environment via Deep Learning [0.9012198585960443]
We consider the tasks of reconstructing and classifying quantum states corrupted by an unknown noisy channel.
We show how such an approach can be used to recover quantum states with fidelities exceeding 99%.
We also consider the task of distinguishing between different quantum noisy channels, and show how a neural network-based classifier is able to solve such a classification problem with perfect accuracy.
arXiv Detail & Related papers (2023-09-21T10:03:30Z) - A Comparison of Neural Networks for Wireless Channel Prediction [10.721189858694398]
It is unclear which neural network-based scheme provides the best performance in terms of prediction quality, training complexity and practical feasibility.
This paper first provides an overview of state-of-the-art neural networks applicable to channel prediction and compares their performance in terms of prediction quality.
The advantages and disadvantages of each neural network are discussed and guidelines for selecting the best-suited neural network in channel prediction applications are given.
arXiv Detail & Related papers (2023-08-27T06:39:46Z) - Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z) - Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition [101.69873988328808]
We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
The input speech is first up-streamed to a quantum computing server to extract the Mel-spectrogram.
The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
The encoded features are then down-streamed to the local RNN model for the final recognition.
arXiv Detail & Related papers (2020-10-26T03:36:01Z) - Hidden Markov Neural Networks [5.8317379706611385]
We define a Bayesian neural network that evolves in time, called a Hidden Markov Neural Network.
Experiments on MNIST, dynamic classification tasks, and next-frame forecasting in videos demonstrate that Hidden Markov Neural Networks provide strong predictive performance.
arXiv Detail & Related papers (2020-04-15T09:18:18Z) - Data-Driven Symbol Detection via Model-Based Machine Learning [117.58188185409904]
We review a data-driven framework to symbol detection design which combines machine learning (ML) and model-based algorithms.
In this hybrid approach, well-known channel-model-based algorithms are augmented with ML-based algorithms to remove their channel-model-dependence.
Our results demonstrate that these techniques can yield near-optimal performance of model-based algorithms without knowing the exact channel input-output statistical relationship.
arXiv Detail & Related papers (2020-02-14T06:58:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.