Probability vector representation of the Schrödinger equation and Leggett-Garg-type experiments
- URL: http://arxiv.org/abs/2312.16281v3
- Date: Wed, 26 Jun 2024 06:02:58 GMT
- Title: Probability vector representation of the Schrödinger equation and Leggett-Garg-type experiments
- Authors: Masahiro Hotta, Sebastian Murk
- Abstract summary: Leggett-Garg inequalities place bounds on the temporal correlations of a system based on the principles of macroscopic realism.
We propose a scheme to describe the dynamics of generic $N$-level quantum systems via a probability vector representation of the Schrödinger equation.
We also define a precise notion of no-signaling in time (NSIT) for the probability distributions of noncommuting observables.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Leggett-Garg inequalities place bounds on the temporal correlations of a system based on the principles of macroscopic realism $\textit{per se}$ and noninvasive measurability. Their conventional formulation relies on the ensemble-averaged products of observables measured at different instants of time. However, a complete description that enables a precise understanding and captures all physically relevant features requires the study of probability distributions associated with noncommuting observables. In this article, we propose a scheme to describe the dynamics of generic $N$-level quantum systems ("qudits") via a probability vector representation of the Schr\"odinger equation and define a precise notion of no-signaling in time (NSIT) for the probability distributions of noncommuting observables. This provides a systematic way of identifying the interferences responsible for nonclassical behavior. In addition, we introduce an interference witness measure to quantify violations of NSIT for arbitrary general probabilistic states. For single-qubit systems, we pinpoint the pivotal relation that establishes a connection between the disturbance of observables incurred during a measurement and the resulting NSIT violation. For large-$N$ systems where a manual determination is infeasible, the classification of states as either NSIT-conforming or NSIT-violating may be performed by a machine learning algorithm. We present a proof-of-principle implementation of such an algorithm in which the classifier function is prepared via supervised learning using pseudorandomly generated training data sets composed of states whose corresponding classifications are known $\textit{a priori}$.
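To make the NSIT notion concrete, the following minimal sketch (illustrative only, not the paper's probability-vector formalism; parameters and the choice of $\sigma_z$ measurements are assumptions) compares the outcome distribution of a driven qubit at a final time with and without a non-selective intermediate measurement, and uses the total variation distance between the two distributions as a simple interference-witness-style quantity. It assumes only NumPy.

```python
import numpy as np

# Pauli matrices and a simple qubit Hamiltonian (hypothetical example parameters)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
H = 0.5 * sx  # drives oscillations between the sigma_z eigenstates

def evolve(rho, t):
    """Unitarily evolve a density matrix rho for time t (hbar = 1)."""
    w, v = np.linalg.eigh(H)
    U = v @ np.diag(np.exp(-1j * w * t)) @ v.conj().T
    return U @ rho @ U.conj().T

def probs_z(rho):
    """Born-rule probability vector for a sigma_z measurement (outcomes +1, -1)."""
    return np.real(np.array([rho[0, 0], rho[1, 1]]))

def dephase_z(rho):
    """Non-selective projective sigma_z measurement: kill the off-diagonal terms."""
    return np.diag(np.diag(rho))

# Initial state |0><0| and two measurement times t1 < t2
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
t1, t2 = 1.0, 2.0

# Case (i): no intermediate measurement
p_no_meas = probs_z(evolve(rho0, t2))

# Case (ii): non-selective sigma_z measurement at t1, then further evolution
rho_t1 = dephase_z(evolve(rho0, t1))
p_meas = probs_z(evolve(rho_t1, t2 - t1))

# Interference-witness-style quantity: total variation distance between the
# two final distributions (zero iff NSIT holds for this pair of settings).
witness = 0.5 * np.sum(np.abs(p_no_meas - p_meas))
print(f"P(z at t2) without measurement at t1: {p_no_meas}")
print(f"P(z at t2) with measurement at t1:    {p_meas}")
print(f"NSIT violation witness: {witness:.4f}")
```

For a qubit initialized in $|0\rangle$ under this drive, the two final distributions differ, so the witness is nonzero and NSIT is violated for this pair of measurement settings; the interference witness measure and the notion of NSIT for noncommuting observables defined in the paper are more general than this toy check.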
Related papers
- Decoherence and Probability [0.0]
Non-probabilistic accounts of the emergence of probability via decoherence are unconvincing.
An alternative account of the emergence of probability involves the combination of quasi-probabilistic emergence via a partially interpreted decoherence model.
arXiv Detail & Related papers (2024-10-02T08:16:09Z)
- The observer effect in quantum: the case of classification [0.0]
We show that sensory information becomes intricately entangled with observer states.
This framework lays the groundwork for a quantum-probability-based understanding of the observer effect.
arXiv Detail & Related papers (2024-06-12T15:23:53Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and the method matches data statistics even when sampling from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z)
- Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and properties of statistical estimation, are still obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z)
- Probabilistic Systems with Hidden State and Unobservable Transitions [5.124254186899053]
We consider probabilistic systems with hidden state and unobservable transitions.
We present an algorithm for determining the most probable explanation given an observation.
We also present a method for parameter learning that adapts the probabilities of a given model based on an observation.
arXiv Detail & Related papers (2022-05-27T10:06:04Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- The Causal Neural Connection: Expressiveness, Learnability, and Inference [125.57815987218756]
An object called a structural causal model (SCM) represents a collection of mechanisms and sources of random variation of the system under investigation.
In this paper, we show that the causal hierarchy theorem (Thm. 1, Bareinboim et al., 2020) still holds for neural models.
We introduce a special type of SCM called a neural causal model (NCM), and formalize a new type of inductive bias to encode structural constraints necessary for performing causal inferences.
arXiv Detail & Related papers (2021-07-02T01:55:18Z)
- Tractable Inference in Credal Sentential Decision Diagrams [116.6516175350871]
Probabilistic sentential decision diagrams are logic circuits where the inputs of disjunctive gates are annotated by probability values.
We develop credal sentential decision diagrams, a generalisation of their probabilistic counterparts that allows for replacing the local probabilities with credal sets of mass functions.
For a first empirical validation, we consider a simple application based on noisy seven-segment display images.
arXiv Detail & Related papers (2020-08-19T16:04:34Z)
- Unifying supervised learning and VAEs -- coverage, systematics and goodness-of-fit in normalizing-flow based neural network models for astro-particle reconstructions [0.0]
Statistical uncertainties, coverage, systematic uncertainties or a goodness-of-fit measure are often not calculated.
We show that a KL-divergence objective of the joint distribution of data and labels allows us to unify supervised learning and variational autoencoders.
We discuss how to calculate coverage probabilities without numerical integration for specific "base-ordered" contours.
arXiv Detail & Related papers (2020-08-13T11:28:57Z)