Fisher information of correlated stochastic processes
- URL: http://arxiv.org/abs/2206.00463v2
- Date: Wed, 7 Jun 2023 13:45:16 GMT
- Title: Fisher information of correlated stochastic processes
- Authors: Marco Radaelli, Gabriel T. Landi, Kavan Modi, Felix C. Binder
- Abstract summary: We prove two results concerning the estimation of parameters encoded in a memoryful process.
First, we show that for processes with finite Markov order, the Fisher information is always linear in the number of outcomes.
Second, we prove with suitable examples that correlations do not necessarily enhance the metrological precision.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Many real-world tasks include some kind of parameter estimation, i.e.,
determination of a parameter encoded in a probability distribution. Often, such
probability distributions arise from stochastic processes. For a stationary
stochastic process with temporal correlations, the random variables that
constitute it are identically distributed but not independent. This is the
case, for instance, for quantum continuous measurements. In this paper we prove
two fundamental results concerning the estimation of parameters encoded in a
memoryful stochastic process. First, we show that for processes with finite
Markov order, the Fisher information is always asymptotically linear in the
number of outcomes, and determined by the conditional distribution of the
process' Markov order. Second, we prove with suitable examples that
correlations do not necessarily enhance the metrological precision. In fact, we
show that, unlike for entropic information quantities, in general nothing can be
said about the sub- or super-additivity of the joint Fisher information in the
presence of correlations. We discuss how the type of correlations in the
process affects the scaling. We then apply these results to the case of
thermometry on a spin chain.
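The first result above can be illustrated numerically: for a process with finite Markov order, the Fisher information of n outcomes grows asymptotically linearly in n, with slope set by the conditional distribution. The sketch below is illustrative only (not from the paper): it uses a hypothetical symmetric two-state Markov chain whose stay-probability is the unknown parameter theta, computes the exact joint Fisher information by enumerating sequences, and checks that the per-outcome increments settle to a constant.

```python
import itertools
import numpy as np

def transition(theta):
    # Hypothetical Markov-order-1 chain: stay in the current state
    # with probability theta, switch with probability 1 - theta.
    return np.array([[theta, 1.0 - theta],
                     [1.0 - theta, theta]])

def joint_log_prob(seq, theta):
    # Log-probability of a full outcome sequence; the stationary
    # initial distribution of this symmetric chain is (1/2, 1/2)
    # and does not depend on theta.
    T = transition(theta)
    logp = np.log(0.5)
    for a, b in zip(seq, seq[1:]):
        logp += np.log(T[a, b])
    return logp

def fisher_info(n, theta, eps=1e-5):
    # Exact Fisher information of the joint distribution of n outcomes,
    # F(n) = sum_x p(x) (d/dtheta log p(x))^2, with the score computed
    # by central finite differences.
    total = 0.0
    for seq in itertools.product([0, 1], repeat=n):
        lp = joint_log_prob(seq, theta)
        score = (joint_log_prob(seq, theta + eps)
                 - joint_log_prob(seq, theta - eps)) / (2 * eps)
        total += np.exp(lp) * score**2
    return total

theta = 0.7
# Per-outcome increments F(n+1) - F(n); for a finite-Markov-order
# process these converge to a constant, i.e. F(n) is asymptotically
# linear in n (here each transition is a Bernoulli(theta) trial, so
# the increment is 1 / (theta * (1 - theta))).
increments = [fisher_info(n + 1, theta) - fisher_info(n, theta)
              for n in range(2, 8)]
print(increments)
```

In this toy chain the conditional distribution at each step is a Bernoulli(theta) trial regardless of the current state, so the linear slope can be verified against the single-trial Fisher information 1/(theta(1-theta)); for chains with longer memory the slope is instead determined by the conditional distribution over the Markov-order-length past, as the paper's first result states.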
Related papers
- Logistic-beta processes for dependent random probabilities with beta marginals [58.91121576998588]
We propose a novel process called the logistic-beta process, whose logistic transformation yields a process with common beta marginals.
It can model dependence on both discrete and continuous domains, such as space or time, and has a flexible dependence structure through correlation kernels.
We illustrate the benefits through nonparametric binary regression and conditional density estimation examples, both in simulation studies and in a pregnancy outcome application.
arXiv Detail & Related papers (2024-02-10T21:41:32Z) - Parameter estimation for quantum jump unraveling [0.0]
We consider the estimation of parameters encoded in the measurement record of a continuously monitored quantum system in the jump unraveling.
Here, it is generally difficult to assess the precision of the estimation procedure via the Fisher Information due to intricate temporal correlations and memory effects.
arXiv Detail & Related papers (2024-02-09T17:14:38Z) - Ultimate limit on learning non-Markovian behavior: Fisher information rate and excess information [0.0]
We address the fundamental limits of learning unknown parameters of any process from time-series data.
We discover exact closed-form expressions for how optimal inference scales with observation length.
arXiv Detail & Related papers (2023-10-06T01:53:42Z) - Stochastic metrology and the empirical distribution [0.0]
We study the problem of parameter estimation in time series stemming from general processes, where the outcomes may exhibit arbitrary correlations.
We derive practical formulas for the resulting Fisher information for various scenarios, from generic stationary processes to discrete-time Markov chains to continuous-time classical master equations.
arXiv Detail & Related papers (2023-05-25T21:22:34Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results both in the infinite-sample and in the finite-sample regimes.
arXiv Detail & Related papers (2022-10-03T06:09:01Z) - On the Dynamics of Inference and Learning [0.0]
We present a treatment of this Bayesian updating process as a continuous dynamical system.
We show that when the Cramér-Rao bound is saturated the learning rate is governed by a simple $1/T$ power-law.
arXiv Detail & Related papers (2022-04-19T18:04:36Z) - Equivariance Discovery by Learned Parameter-Sharing [153.41877129746223]
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes.
Also, we theoretically analyze the method for Gaussian data and provide a bound on the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z) - Nonparametric Conditional Local Independence Testing [69.31200003384122]
Conditional local independence is an independence relation among continuous time processes.
No nonparametric test of conditional local independence has been available.
We propose such a nonparametric test based on double machine learning.
arXiv Detail & Related papers (2022-03-25T10:31:02Z) - Contrastive learning of strong-mixing continuous-time stochastic processes [53.82893653745542]
Contrastive learning is a family of self-supervised methods where a model is trained to solve a classification task constructed from unlabeled data.
We show that a properly constructed contrastive learning task can be used to estimate the transition kernel for small-to-mid-range intervals in the diffusion case.
arXiv Detail & Related papers (2021-03-03T23:06:47Z) - On the Estimation of Information Measures of Continuous Distributions [25.395010130602287]
The estimation of information measures of continuous distributions based on samples is a fundamental problem in statistics and machine learning.
We provide confidence bounds for simple histogram based estimation of differential entropy from a fixed number of samples.
Our focus is on differential entropy, but we provide examples that show that similar results hold for mutual information and relative entropy as well.
arXiv Detail & Related papers (2020-02-07T15:36:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.