Covariance Decomposition as a Universal Limit on Correlations in
Networks
- URL: http://arxiv.org/abs/2103.14840v2
- Date: Sat, 12 Feb 2022 20:33:59 GMT
- Title: Covariance Decomposition as a Universal Limit on Correlations in
Networks
- Authors: Salman Beigi, Marc-Olivier Renou
- Abstract summary: We show that in a network satisfying a certain condition, the covariance matrix of any feasible correlation can be decomposed as a sum of positive semidefinite matrices, each term corresponding to a source in the network.
Our result is universal in the sense that it holds in any physical theory of correlations in networks, including the classical, quantum, and all generalized probabilistic theories.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Parties connected to independent sources through a network can generate
correlations among themselves. Notably, the space of feasible correlations for
a given network depends on the physical nature of the sources and the
measurements performed by the parties. In particular, quantum sources give
access to nonlocal correlations that cannot be generated classically. In this
paper, we derive a universal limit on correlations in networks in terms of
their covariance matrix. We show that in a network satisfying a certain
condition, the covariance matrix of any feasible correlation can be decomposed
as a sum of positive semidefinite matrices, each term corresponding to a
source in the network. Our result is universal in the sense that it holds in
any physical theory of correlations in networks, including the classical,
quantum, and all generalized probabilistic theories.
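As a concrete illustration (a sketch, not code from the paper): for the triangle network, three parties A, B, C pairwise connected by three independent sources, the decomposition says the 3x3 covariance matrix of one ±1 output per party must split as Cov = T_AB + T_BC + T_AC, with each term positive semidefinite and supported only on the pair of parties its source connects. Checking this is a semidefinite feasibility problem; the sketch below assumes numpy and cvxpy, and the function name is illustrative.

```python
import numpy as np
import cvxpy as cp

# Hypothetical illustration for the triangle network (parties A, B, C,
# one source per pair). Tests whether a covariance matrix admits the
# decomposition Cov = T_AB + T_BC + T_AC with each T positive
# semidefinite and supported only on the pair its source connects.
def admits_triangle_decomposition(cov):
    supports = [(0, 1), (1, 2), (0, 2)]   # parties touched by each source
    terms = [cp.Variable((3, 3), PSD=True) for _ in supports]
    constraints = [sum(terms) == cov]
    for T, supp in zip(terms, supports):
        for i in range(3):
            for j in range(3):
                if i not in supp or j not in supp:
                    constraints.append(T[i, j] == 0)  # zero off the pair
    problem = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility
    problem.solve()
    return problem.status == cp.OPTIMAL

# Perfect three-way correlation of unbiased +/-1 outputs has the
# all-ones covariance; the SDP is infeasible, certifying that no theory
# (classical, quantum, or GPT) realizes it in the triangle network.
print(admits_triangle_decomposition(np.ones((3, 3))))  # False
# Independent outputs (identity covariance) trivially decompose.
print(admits_triangle_decomposition(np.eye(3)))        # True
```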
Related papers
- Distributing quantum correlations through local operations and classical resources
We present a robust, physically motivated protocol by which global quantum correlations can be distributed to quantum memories.
This distribution is measurement-outcome independent and uses only bilocal unitary operations and projective measurements.
arXiv Detail & Related papers (2024-08-10T08:42:06Z)
- Applications of flow models to the generation of correlated lattice QCD ensembles
Machine-learned normalizing flows can be used in the context of lattice quantum field theory to generate statistically correlated ensembles of lattice gauge fields at different action parameters.
This work demonstrates how these correlations can be exploited for variance reduction in the computation of observables.
arXiv Detail & Related papers (2024-01-19T18:33:52Z)
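A toy numpy illustration of the variance-reduction idea above (unrelated to the flow machinery itself, with made-up numbers): when two ensembles share their dominant fluctuations, a difference of observables fluctuates far less than either observable alone.

```python
import numpy as np

# Toy illustration only: two "ensembles" sharing a common fluctuation.
# The difference of observables computed on correlated samples has a
# much smaller variance than independent ensembles would give.
rng = np.random.default_rng(0)
common = rng.normal(size=100_000)                    # shared fluctuations
o1 = 1.00 + common + 0.05 * rng.normal(size=100_000)
o2 = 1.02 + common + 0.05 * rng.normal(size=100_000)
print(np.var(o1 - o2))          # ~0.005: the shared part cancels
print(np.var(o1) + np.var(o2))  # ~2.0: what independent samples give
```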
- Interpolation, Approximation and Controllability of Deep Neural Networks
We consider two properties that arise from supervised learning, namely universal interpolation and universal approximation.
We give a characterisation of universal equivalence, showing that it holds for essentially any architecture with non-linearity.
arXiv Detail & Related papers (2023-09-12T07:29:47Z)
- Correlations in Disordered Solvable Tensor Network States
Solvable matrix product and projected entangled pair states evolved by dual and ternary-unitary quantum circuits have analytically accessible correlation functions.
We compute the average behavior of a physically motivated two-point equal-time correlation function with respect to random disordered tensor network states.
arXiv Detail & Related papers (2023-09-09T12:31:22Z)
- Curvature-informed multi-task learning for graph networks
State-of-the-art graph neural networks attempt to predict multiple properties simultaneously.
We investigate a potential obstacle to such multi-task training: the curvature of each property's loss surface varies significantly, leading to inefficient learning.
arXiv Detail & Related papers (2022-08-02T18:18:41Z)
- The Separation Capacity of Random Neural Networks
We show that a sufficiently large two-layer ReLU network with standard Gaussian weights and uniformly distributed biases can make two classes of data linearly separable with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z)
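A small numpy sketch of the setting just described (the ring-shaped toy data, layer width, and bias range are illustrative choices, not the paper's): a single random ReLU layer renders data linearly separable that is not separable in input space.

```python
import numpy as np

# Two concentric rings: not linearly separable in the plane.
rng = np.random.default_rng(0)
n, width = 200, 500
r = np.concatenate([rng.uniform(0, 1, n // 2), rng.uniform(2, 3, n // 2)])
ang = rng.uniform(0, 2 * np.pi, n)
X = np.c_[r * np.cos(ang), r * np.sin(ang)]
y = np.r_[-np.ones(n // 2), np.ones(n // 2)]

W = rng.normal(size=(2, width))       # standard Gaussian weights
b = rng.uniform(-3, 3, width)         # uniformly distributed biases
H = np.maximum(X @ W + b, 0.0)        # random ReLU features

# Least-squares linear readout; perfect sign agreement on the sample
# certifies linear separability of the features (typically True here).
w, *_ = np.linalg.lstsq(H, y, rcond=None)
print("separable:", bool(np.all(np.sign(H @ w) == y)))
```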
- Full network nonlocality
We introduce the concept of full network nonlocality, which describes correlations that necessitate all links in a network to distribute nonlocal resources.
We show that the best-known network Bell test does not witness full network nonlocality.
More generally, we point out that established methods for analysing local and theory-independent correlations in networks can be combined to deduce sufficient conditions for full network nonlocality.
arXiv Detail & Related papers (2021-05-19T18:00:02Z)
- Computable Rényi mutual information: Area laws and correlations
The mutual information is a measure of classical and quantum correlations of great interest in quantum information.
Here, we consider alternative definitions based on Rényi divergences.
We show that they obey a thermal area law in great generality, and that they upper bound all correlation functions.
arXiv Detail & Related papers (2021-03-02T13:33:42Z)
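For orientation, one standard Rényi generalization is sketched below (the Petz divergence applied to the product of marginals; the paper's computable definitions may differ in the precise variant):

```latex
% Sketch of one common variant; not necessarily the paper's exact choice.
\[
  D_\alpha(\rho \,\|\, \sigma)
    = \frac{1}{\alpha - 1}
      \log \operatorname{Tr}\!\left[ \rho^{\alpha} \sigma^{1-\alpha} \right],
  \qquad
  I_\alpha(A\!:\!B)
    = D_\alpha\!\left( \rho_{AB} \,\|\, \rho_A \otimes \rho_B \right),
\]
% which recovers the usual mutual information in the limit \alpha \to 1.
```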
- Deep Archimedean Copulas
ACNet is a novel differentiable neural network architecture that enforces the structural properties of Archimedean copulas.
We show that ACNet is able to both approximate common Archimedean Copulas and generate new copulas which may provide better fits to data.
arXiv Detail & Related papers (2020-12-05T22:58:37Z)
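For context, every Archimedean copula is built from a single generator psi via C(u_1, ..., u_d) = psi(psi_inv(u_1) + ... + psi_inv(u_d)); ACNet learns psi with a suitably constrained network. A minimal numpy sketch with the closed-form Clayton generator standing in for the learned one (function names are illustrative):

```python
import numpy as np

# Archimedean structure: C(u_1,...,u_d) = psi(sum_i psi_inv(u_i)).
# Here the closed-form Clayton generator stands in for ACNet's
# learned, completely monotone generator.
def clayton_psi(t, theta):
    return (1.0 + theta * t) ** (-1.0 / theta)

def clayton_psi_inv(u, theta):
    return (u ** (-theta) - 1.0) / theta

def archimedean_copula(u, theta=2.0):
    """u: array of shape (n, d) with entries in (0, 1)."""
    s = clayton_psi_inv(np.asarray(u), theta).sum(axis=1)
    return clayton_psi(s, theta)

# Matches the closed-form Clayton copula (u^-t + v^-t - 1)^(-1/t).
print(archimedean_copula([[0.3, 0.7], [0.5, 0.5]]))
```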
- The Importance of Being Correlated: Implications of Dependence in Joint Spectral Inference across Multiple Networks
Spectral inference on multiple networks is a rapidly-developing subfield of graph statistics.
Recent work has demonstrated that joint, or simultaneous, spectral embedding of multiple independent networks can deliver more accurate estimation.
We present a generalized omnibus embedding methodology and provide a detailed analysis of this embedding across both independent and correlated networks.
arXiv Detail & Related papers (2020-08-01T03:43:52Z)
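A minimal numpy sketch of the classical omnibus construction that this work generalizes (the generalized omnibus embedding of the paper combines blocks differently):

```python
import numpy as np

# Classical omnibus embedding: stack m symmetric adjacency matrices
# into one mn x mn block matrix whose (i, j) block is (A_i + A_j) / 2,
# then spectrally embed, giving all networks coordinates in a single
# shared space with no post-hoc alignment.
def omnibus_embedding(adjs, d):
    m, n = len(adjs), adjs[0].shape[0]
    M = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(m):
            M[i*n:(i+1)*n, j*n:(j+1)*n] = (adjs[i] + adjs[j]) / 2.0
    vals, vecs = np.linalg.eigh(M)
    top = np.argsort(np.abs(vals))[::-1][:d]         # top-d by magnitude
    X = vecs[:, top] * np.sqrt(np.abs(vals[top]))    # scaled eigenvectors
    return [X[i*n:(i+1)*n] for i in range(m)]        # one n x d block per graph
```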
- Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators
Invertible neural networks based on coupling flows (CF-INNs) have various machine learning applications such as image synthesis and representation learning.
Are CF-INNs universal approximators for invertible functions?
We prove a general theorem showing the equivalence of universality for certain diffeomorphism classes.
arXiv Detail & Related papers (2020-06-20T02:07:37Z)
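To ground the question, here is a minimal numpy sketch of the affine coupling layer underlying CF-INNs (toy conditioner functions stand in for trained networks): the layer is invertible by construction no matter what s and t compute.

```python
import numpy as np

# One affine coupling layer: transform half the coordinates using
# functions of the other half; inversion never requires inverting s or t.
def coupling_forward(x, s, t):
    x1, x2 = np.split(x, 2)
    y2 = x2 * np.exp(s(x1)) + t(x1)    # scale-and-shift one half
    return np.concatenate([x1, y2])    # pass the other half through

def coupling_inverse(y, s, t):
    y1, y2 = np.split(y, 2)
    x2 = (y2 - t(y1)) * np.exp(-s(y1))
    return np.concatenate([y1, x2])

s = lambda z: np.tanh(z)               # toy conditioners
t = lambda z: 0.5 * z
x = np.array([0.5, -1.0, 2.0, 0.1])
y = coupling_forward(x, s, t)
print(np.allclose(coupling_inverse(y, s, t), x))  # True
```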