Robustness measures for quantifying nonlocality
- URL: http://arxiv.org/abs/2311.07077v1
- Date: Mon, 13 Nov 2023 04:59:31 GMT
- Title: Robustness measures for quantifying nonlocality
- Authors: Kyunghyun Baek, Junghee Ryu, Jinhyoung Lee
- Abstract summary: We show that white-noise robustness does not fulfill monotonicity under local operation and shared randomness.
This study contributes to the resource theory of nonlocality and sheds light on comparing monotones by using the concept of inequivalence valid for all resource theories.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We suggest generalized robustness for quantifying nonlocality and investigate
its properties by comparing it with white-noise and standard robustness
measures. As a result, we show that white-noise robustness does not fulfill
monotonicity under local operation and shared randomness, whereas the other
measures do. To compare the standard and generalized robustness measures, we
introduce the concept of inequivalence, which indicates a reversal in the order
relationship depending on the choice of monotones. From an operational
perspective, the inequivalence of monotones for resourceful objects implies the
absence of free operations that connect them. Applying this concept, we find
that standard and generalized robustness measures are inequivalent between
even- and odd-dimensional cases up to eight dimensions. This is obtained using
randomly performed CGLMP measurement settings in a maximally entangled state.
This study contributes to the resource theory of nonlocality and sheds light on
comparing monotones by using the concept of inequivalence valid for all
resource theories.
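As a rough sketch of the three measures being compared (notation ours, not taken from the paper): each robustness is the minimal mixing weight that brings a nonlocal behavior P into the local set, and they differ only in which noise behaviors Q are admitted — arbitrary for generalized robustness, local for standard robustness, and the fixed white-noise box for white-noise robustness.

```latex
% P: behavior (box); \mathcal{L}: set of local behaviors;
% \mathbb{1}: white-noise (uniformly random) box. Notation assumed.
\begin{align*}
  R_g(P) &= \min\Big\{ s \ge 0 \;:\; \tfrac{P + s\,Q}{1+s} \in \mathcal{L}
            \ \text{for some behavior } Q \Big\}, \\
  R_s(P) &= \min\Big\{ s \ge 0 \;:\; \tfrac{P + s\,Q}{1+s} \in \mathcal{L}
            \ \text{for some } Q \in \mathcal{L} \Big\}, \\
  R_w(P) &= \min\Big\{ s \ge 0 \;:\; \tfrac{P + s\,\mathbb{1}}{1+s} \in \mathcal{L} \Big\}.
\end{align*}
```

Since the admissible noise sets are nested, these minima satisfy R_g(P) ≤ R_s(P) and R_g(P) ≤ R_w(P) for every behavior P.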
Related papers
- Violation of steering inequality for generalized equiangular measurements [0.6906005491572401]
We study bipartite quantum steering using a general class of measurement operators known as the generalized equiangular measurement (GEAM).
Our approach allows for the construction of steering inequalities that are applicable not only to informationally complete measurements but also to incomplete or lossy scenarios.
arXiv Detail & Related papers (2025-04-11T12:54:00Z) - A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variants of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z) - Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
This method is statistically consistent, and makes the inductive validation model clear and consistent.
arXiv Detail & Related papers (2023-07-25T18:47:53Z) - An entropic uncertainty principle for mixed states [0.0]
We provide a family of generalizations of the entropic uncertainty principle.
Results can be used to certify entanglement between trusted parties, or to bound the entanglement of a system with an untrusted environment.
arXiv Detail & Related papers (2023-03-20T18:31:53Z) - A complete and operational resource theory of measurement sharpness [2.4554686192257424]
We construct a resource theory of sharpness for finite-dimensional positive operator-valued measures (POVMs).
We show that our theory has maximal (i.e., sharp) elements, which are all equivalent, and coincide with the set of POVMs that admit a repeatable measurement.
We show that one POVM can be transformed into another by means of a sharpness-non-increasing operation if and only if the former is sharper than the latter with respect to all monotones.
arXiv Detail & Related papers (2023-03-14T09:33:55Z) - A signature of quantumness in pure decoherence control [0.0]
We study a decoherence reduction scheme that involves an intermediate measurement on the qubit in an equal superposition basis.
We show under what circumstances the scheme always leads to a gain of coherence on average, regardless of the time at which the measurement is performed.
We find that observing an average loss of coherence is a highly quantum effect, resulting from non-commutation of different terms in the Hamiltonian.
arXiv Detail & Related papers (2022-11-09T14:13:25Z) - On the Importance of Gradient Norm in PAC-Bayesian Bounds [92.82627080794491]
We propose a new generalization bound that exploits the contractivity of the log-Sobolev inequalities.
We empirically analyze the effect of this new loss-gradient norm term on different neural architectures.
arXiv Detail & Related papers (2022-10-12T12:49:20Z) - Large-Sample Properties of Non-Stationary Source Separation for Gaussian Signals [2.2557806157585834]
We develop large-sample theory for NSS-JD, a popular method of non-stationary source separation.
We show that the unmixing estimator is consistent and converges to a limiting Gaussian distribution at the standard square-root rate.
Simulation experiments are used to verify the theoretical results and to study the impact of block length on the separation.
arXiv Detail & Related papers (2022-09-21T08:13:20Z) - Predicting Out-of-Domain Generalization with Neighborhood Invariance [59.05399533508682]
We propose a measure of a classifier's output invariance in a local transformation neighborhood.
Our measure is simple to calculate, does not depend on the test point's true label, and can be applied even in out-of-domain (OOD) settings.
In experiments on benchmarks in image classification, sentiment analysis, and natural language inference, we demonstrate a strong and robust correlation between our measure and actual OOD generalization.
arXiv Detail & Related papers (2022-07-05T14:55:16Z) - Intrinsic randomness under general quantum measurements [2.8101673772585736]
When measuring a state with von Neumann measurements, the intrinsic randomness can be quantified by the quantum coherence of the state on the measurement basis.
We propose an adversary scenario for general measurements with arbitrary input states, based on which, we characterize the intrinsic randomness.
Our results show that intrinsic randomness can quantify coherence under general measurements, which generalizes the result in the standard resource theory of state coherence.
arXiv Detail & Related papers (2022-03-16T13:53:20Z) - A Unified Framework for Multi-distribution Density Ratio Estimation [101.67420298343512]
Binary density ratio estimation (DRE) provides the foundation for many state-of-the-art machine learning algorithms.
We develop a general framework from the perspective of Bregman minimization divergence.
We show that our framework leads to methods that strictly generalize their counterparts in binary DRE.
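As an illustration of the binary DRE this entry builds on (a standard classifier-based reduction, not the paper's Bregman-divergence framework; data and parameters below are our own toy choices): a probabilistic classifier trained to separate equally sized samples from p and q estimates the ratio p(x)/q(x) through its odds s(x)/(1 - s(x)).

```python
# Sketch, assumptions ours: density ratio estimation via logistic regression.
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain full-batch gradient-descent logistic regression; returns (w, b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(y = 1 | x)
        g = s - y                                # gradient of the log-loss
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def density_ratio(x, w, b):
    """Estimate p(x)/q(x) as the classifier odds s / (1 - s)."""
    s = 1.0 / (1.0 + np.exp(-(x @ w + b)))
    return s / (1.0 - s)

rng = np.random.default_rng(0)
Xp = rng.normal(1.0, 1.0, size=(2000, 1))   # samples from p = N(1, 1)
Xq = rng.normal(-1.0, 1.0, size=(2000, 1))  # samples from q = N(-1, 1)
X = np.vstack([Xp, Xq])
y = np.concatenate([np.ones(2000), np.zeros(2000)])
w, b = fit_logistic(X, y)
# For these Gaussians the true log-ratio is linear: log r(x) = 2x,
# so the estimated ratio should be close to 1 at the midpoint x = 0.
print(density_ratio(np.array([[0.0]]), w, b))
```

Because the true log-ratio is linear here, logistic regression is well specified; for general densities the same odds trick holds but the classifier must be flexible enough.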
arXiv Detail & Related papers (2021-12-07T01:23:20Z) - On the Double Descent of Random Features Models Trained with SGD [78.0918823643911]
We study properties of random features (RF) regression in high dimensions optimized by stochastic gradient descent (SGD).
We derive precise non-asymptotic error bounds for RF regression under both constant and adaptive step-size SGD settings.
We observe the double descent phenomenon both theoretically and empirically.
arXiv Detail & Related papers (2021-10-13T17:47:39Z) - Invariance Principle Meets Information Bottleneck for Out-of-Distribution Generalization [77.24152933825238]
We show that for linear classification tasks we need stronger restrictions on the distribution shifts, or otherwise OOD generalization is impossible.
We prove that a form of the information bottleneck constraint along with invariance helps address key failures when invariant features capture all the information about the label and also retains the existing success when they do not.
arXiv Detail & Related papers (2021-06-11T20:42:27Z) - GroupifyVAE: from Group-based Definition to VAE-based Unsupervised Representation Disentanglement [91.9003001845855]
VAE-based unsupervised disentanglement cannot be achieved without introducing other inductive biases.
We address VAE-based unsupervised disentanglement by leveraging the constraints derived from the Group Theory based definition as the non-probabilistic inductive bias.
We train 1800 models covering the most prominent VAE-based models on five datasets to verify the effectiveness of our method.
arXiv Detail & Related papers (2021-02-20T09:49:51Z) - Fractional norms and quasinorms do not help to overcome the curse of dimensionality [62.997667081978825]
It has been suggested that the Manhattan distance, and even fractional quasinorms lp (p < 1), can help to overcome the curse of dimensionality in classification problems.
A systematic comparison shows that the difference in performance of kNN based on lp for p = 2, 1, and 0.5 is statistically insignificant.
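The comparison in this entry can be reproduced in miniature (a minimal numpy sketch; the data, dimensionality, and k are our own toy choices, not the paper's benchmark): run nearest-neighbor classification under the Minkowski distance d_p(x, y) = (Σ |x_i - y_i|^p)^(1/p), including the fractional quasinorm p = 0.5.

```python
# Sketch, assumptions ours: 1-NN under l2, l1, and the l0.5 quasinorm.
import numpy as np

def minkowski(a, b, p):
    """d_p between one point a and each row of b (a quasinorm for p < 1)."""
    return np.sum(np.abs(b - a) ** p, axis=1) ** (1.0 / p)

def knn_predict(X_train, y_train, X_test, p, k=1):
    preds = []
    for x in X_test:
        nearest = np.argsort(minkowski(x, X_train, p))[:k]
        preds.append(np.bincount(y_train[nearest]).argmax())
    return np.array(preds)

rng = np.random.default_rng(1)
# Two Gaussian blobs in 20 dimensions, unit separation per coordinate.
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 20)),
               rng.normal(1.0, 1.0, size=(100, 20))])
y = np.array([0] * 100 + [1] * 100)
test = np.vstack([rng.normal(0.0, 1.0, size=(50, 20)),
                  rng.normal(1.0, 1.0, size=(50, 20))])
truth = np.array([0] * 50 + [1] * 50)
for p in (2.0, 1.0, 0.5):
    acc = (knn_predict(X, y, test, p) == truth).mean()
    print(f"p={p}: accuracy {acc:.2f}")
```

On data like this, the three values of p typically give comparable accuracies, consistent with the paper's conclusion that the choice of norm is not what defeats the curse of dimensionality.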
arXiv Detail & Related papers (2020-04-29T14:30:12Z) - Generalized Sliced Distances for Probability Distributions [47.543990188697734]
We introduce a broad family of probability metrics, coined Generalized Sliced Probability Metrics (GSPMs).
GSPMs are rooted in the generalized Radon transform and come with a unique geometric interpretation.
We consider GSPM-based gradient flows for generative modeling applications and show that under mild assumptions, the gradient flow converges to the global optimum.
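The slicing idea behind GSPMs is easiest to see in its classical special case, the sliced 1-Wasserstein distance, where the generalized Radon transform reduces to linear projections (a minimal Monte-Carlo sketch; sample sizes and the projection count are our own choices):

```python
# Sketch, assumptions ours: sliced W1 via random linear projections.
import numpy as np

def sliced_w1(X, Y, n_proj=200, seed=0):
    """Monte-Carlo sliced 1-Wasserstein between equal-size samples X, Y."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)        # random direction on the sphere
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean(np.abs(px - py))     # 1-D W1 via sorted samples
    return total / n_proj

rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0, size=(500, 2))
B = rng.normal(3.0, 1.0, size=(500, 2))
print(sliced_w1(A, A.copy()))  # 0 for identical samples
print(sliced_w1(A, B))         # grows with the separation between samples
```

Each slice reduces the d-dimensional comparison to a closed-form 1-D optimal transport problem, which is what makes these metrics cheap enough to drive gradient flows for generative modeling.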
arXiv Detail & Related papers (2020-02-28T04:18:00Z) - Geometric Formulation of Universally Valid Uncertainty Relation for Error [1.696974372855528]
We present a new geometric formulation of the uncertainty relation valid for any quantum measurement of statistical nature.
Owing to its simplicity and tangibility, our relation is universally valid and experimentally viable.
arXiv Detail & Related papers (2020-02-10T18:31:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.