Efficient graph-diagonal characterization of noisy states distributed over quantum networks via Bell sampling
- URL: http://arxiv.org/abs/2512.06650v1
- Date: Sun, 07 Dec 2025 04:19:09 GMT
- Title: Efficient graph-diagonal characterization of noisy states distributed over quantum networks via Bell sampling
- Authors: Zherui Jerry Wang, Joshua Carlo A. Casapao, Naphan Benchasattabuse, Ananda G. Maity, Jordi Tura, Akihito Soeda, Michal Hajdušek, Rodney Van Meter, David Elkouss
- Abstract summary: Graph states are an important class of entangled states that serve as a key resource for distributed information processing and communication in quantum networks. We propose a protocol that utilizes a Bell sampling subroutine to characterize the diagonal elements in the graph basis of noisy graph states distributed across a network.
- Score: 0.10486135378491267
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph states are an important class of entangled states that serve as a key resource for distributed information processing and communication in quantum networks. In this work, we propose a protocol that utilizes a Bell sampling subroutine to characterize the diagonal elements in the graph basis of noisy graph states distributed across a network. Our approach offers significant advantages over direct diagonal estimation using unentangled single-qubit measurements in terms of scalability. Specifically, we prove that estimating the full vector of diagonal elements requires a sample complexity that scales linearly with the number of qubits ($\mathcal{O}(n)$), providing an exponential reduction in resource overhead compared to the best known $\mathcal{O}(2^n)$ scaling of direct estimation. Furthermore, we demonstrate that global properties, such as state fidelity, can be estimated with a sample complexity independent of the network size. Finally, we present numerical results indicating that the estimation in practice is more efficient than the derived theoretical bounds. Our work thus establishes a promising technique for efficiently estimating noisy graph states in large networks under realistic experimental conditions.
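The quantity the abstract refers to, the diagonal of a noisy graph state's density matrix in the graph basis, can be computed exactly for small systems. The following sketch is a toy brute-force illustration of that target quantity (not the paper's Bell sampling protocol, which avoids this exponential cost); the function names and the two-qubit example are illustrative assumptions.

```python
import numpy as np
from itertools import product

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def graph_state(n, edges):
    """|G> = (prod of CZ over edges) H^{(x)n} |0...0>, bit 0 = most significant."""
    psi = np.zeros(2**n)
    psi[0] = 1.0
    U = H
    for _ in range(n - 1):
        U = np.kron(U, H)
    psi = U @ psi
    # CZ on each edge: flip the sign of amplitudes where both qubits are 1
    for (i, j) in edges:
        for idx in range(2**n):
            bits = [(idx >> (n - 1 - k)) & 1 for k in range(n)]
            if bits[i] and bits[j]:
                psi[idx] *= -1
    return psi

def graph_basis_diagonal(rho, n, edges):
    """Diagonal of rho in the graph basis {Z^a |G> : a in {0,1}^n}."""
    g = graph_state(n, edges)
    diag = []
    for a in product([0, 1], repeat=n):
        phi = g.copy()
        for idx in range(2**n):
            bits = [(idx >> (n - 1 - k)) & 1 for k in range(n)]
            phi[idx] *= (-1) ** sum(b * ai for b, ai in zip(bits, a))
        diag.append(float(np.real(phi.conj() @ rho @ phi)))
    return np.array(diag)

# Two-qubit graph state under global depolarizing noise
n, edges = 2, [(0, 1)]
g = graph_state(n, edges)
p = 0.1
rho = (1 - p) * np.outer(g, g.conj()) + p * np.eye(2**n) / 2**n
diag = graph_basis_diagonal(rho, n, edges)
# diag[0] = <G|rho|G> is the fidelity with the ideal graph state
```

Each state vector here has 2^n entries, which is exactly the exponential overhead the paper's Bell sampling estimator sidesteps for the diagonal elements themselves.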
Related papers
- A Few Moments Please: Scalable Graphon Learning via Moment Matching [32.65958390286061]
We propose a novel, scalable graphon estimator that directly recovers the graphon via moment matching. We also introduce MomentMixup, a data augmentation technique that performs mixup in the moment space to enhance graphon-based learning.
arXiv Detail & Related papers (2025-06-04T17:51:01Z) - SpectralGap: Graph-Level Out-of-Distribution Detection via Laplacian Eigenvalue Gaps [19.580332929984028]
We propose SpecGap, an effective post-hoc approach for OOD detection on graphs. SpecGap achieves state-of-the-art performance across multiple benchmark datasets.
arXiv Detail & Related papers (2025-05-21T06:47:44Z) - Sparse Training of Discrete Diffusion Models for Graph Generation [45.103518022696996]
We introduce SparseDiff, a novel diffusion model based on the observation that almost all large graphs are sparse.
By selecting a subset of edges, SparseDiff effectively leverages sparse graph representations both during the noising process and within the denoising network.
Our model demonstrates state-of-the-art performance across multiple metrics on both small and large datasets.
arXiv Detail & Related papers (2023-11-03T16:50:26Z) - Multipartite Entanglement Distribution in Quantum Networks using Subgraph Complementations [8.194910516215462]
We propose a novel approach for distributing graph states across a quantum network. We classify common classes of graph states, along with their optimal distribution time using subgraph complementations.
arXiv Detail & Related papers (2023-08-25T23:03:25Z) - Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Network (GNN) has demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z) - Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]² in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z) - Efficient tensor network simulation of quantum many-body physics on sparse graphs [0.0]
We study tensor network states defined on an underlying graph which is sparsely connected.
We find that message-passing inference algorithms can lead to efficient computation of local expectation values.
arXiv Detail & Related papers (2022-06-09T18:00:03Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Learning Optical Flow from a Few Matches [67.83633948984954]
We show that the dense correlation volume representation is redundant and accurate flow estimation can be achieved with only a fraction of elements in it.
Experiments show that our method can reduce computational cost and memory use significantly, while maintaining high accuracy.
arXiv Detail & Related papers (2021-04-05T21:44:00Z) - Semiparametric Nonlinear Bipartite Graph Representation Learning with Provable Guarantees [106.91654068632882]
We consider the bipartite graph and formalize its representation learning problem as a statistical estimation problem of parameters in a semiparametric exponential family distribution.
We show that the proposed objective is strongly convex in a neighborhood around the ground truth, so that a gradient descent-based method achieves linear convergence rate.
Our estimator is robust to any model misspecification within the exponential family, which is validated in extensive experiments.
arXiv Detail & Related papers (2020-03-02T16:40:36Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.