Characteristic Function of the Tsallis $q$-Gaussian and Its Applications
in Measurement and Metrology
- URL: http://arxiv.org/abs/2303.08615v2
- Date: Thu, 18 May 2023 16:18:46 GMT
- Title: Characteristic Function of the Tsallis $q$-Gaussian and Its Applications
in Measurement and Metrology
- Authors: Viktor Witkovský
- Abstract summary: The Tsallis $q$-Gaussian distribution is a powerful generalization of the standard Gaussian distribution.
This paper presents the characteristic function of a linear combination of independent $q$-Gaussian random variables.
It provides an alternative computational procedure to the Monte Carlo method for uncertainty analysis.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Tsallis $q$-Gaussian distribution is a powerful generalization of the
standard Gaussian distribution and is commonly used in various fields,
including non-extensive statistical mechanics, financial markets and image
processing. It belongs to the $q$-distribution family, which is characterized
by a non-additive entropy. Due to their versatility and practicality,
$q$-Gaussians are a natural choice for modeling input quantities in measurement
models. This paper presents the characteristic function of a linear combination
of independent $q$-Gaussian random variables and proposes a numerical method
for its inversion. The proposed technique makes it possible to determine the
exact probability distribution of the output quantity in linear measurement
models, with the input quantities modeled as independent $q$-Gaussian random
variables. It provides an alternative computational procedure to the Monte
Carlo method for uncertainty analysis through the propagation of distributions.
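The approach the abstract describes (the characteristic function of a linear combination of independent inputs, followed by numerical inversion) can be illustrated in the $q \to 1$ limit, where the $q$-Gaussian reduces to the standard Gaussian and the CF is known in closed form. The sketch below uses the standard Gil-Pelaez inversion formula, which is one common inversion technique and not necessarily the paper's specific algorithm; the coefficients and truncation parameters are illustrative:

```python
import numpy as np

def cf_linear_combination(t, coeffs):
    # CF of Y = sum_j c_j X_j for independent standard Gaussian X_j,
    # i.e. the q -> 1 limit of the q-Gaussian, where phi(t) = exp(-t^2/2).
    c = np.asarray(coeffs)
    return np.exp(-0.5 * t**2 * np.sum(c**2))

def gil_pelaez_cdf(x, cf, t_max=50.0, n=20000):
    # Gil-Pelaez inversion:
    #   F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} cf(t)) / t dt,
    # truncated at t_max and evaluated with the midpoint rule.
    dt = t_max / n
    t = (np.arange(n) + 0.5) * dt
    integrand = np.imag(np.exp(-1j * t * x) * cf(t)) / t
    return 0.5 - integrand.sum() * dt / np.pi

# CDF of Y = X1 + 0.5*X2 at x = 1; for Gaussian inputs the exact value
# is Phi(1 / sqrt(1.25)), which the inversion should reproduce.
F = gil_pelaez_cdf(1.0, lambda t: cf_linear_combination(t, [1.0, 0.5]))
print(round(F, 4))
```

For genuine $q \ne 1$ inputs the paper's closed-form CF would replace `cf_linear_combination`; the inversion step stays the same.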
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z)
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- Dynamical System Identification, Model Selection and Model Uncertainty Quantification by Bayesian Inference [0.8388591755871735]
This study presents a Bayesian maximum a posteriori (MAP) framework for dynamical system identification from time-series data.
arXiv Detail & Related papers (2024-01-30T12:16:52Z)
- Generating random Gaussian states [0.4604003661048266]
We show that the eigenvalues of an RQCM converge to a shifted semicircular distribution in the limit of a large number of modes.
We show that the symplectic eigenvalues of an RQCM converge to a probability distribution that can be characterized using free probability.
arXiv Detail & Related papers (2024-01-24T13:06:57Z)
- Bayesian Non-linear Latent Variable Modeling via Random Fourier Features [7.856578780790166]
We present a method to perform Markov chain Monte Carlo inference for generalized nonlinear latent variable modeling.
Inference for such latent variable models is computationally tractable only when the data likelihood is Gaussian.
We show that we can generalize these models to non-Gaussian observations, such as Poisson, negative binomial, and multinomial distributions.
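The random Fourier feature construction underlying that approach is the classic Rahimi-Recht approximation, in which a shift-invariant kernel is approximated by an inner product of randomized cosine features. A minimal sketch (dimensions, lengthscale, and test points are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 2, 2000            # input dimension, number of random features
ell = 1.0                 # RBF lengthscale

# Random Fourier features (Rahimi & Recht, 2007) for the RBF kernel:
#   k(x, y) = exp(-||x - y||^2 / (2 ell^2)) ~= phi(x) @ phi(y)
W = rng.standard_normal((D, d)) / ell
b = rng.uniform(0.0, 2.0 * np.pi, D)

def phi(x):
    # Randomized feature map; the approximation error shrinks as O(1/sqrt(D)).
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = np.array([0.3, -0.5])
y = np.array([0.1, 0.2])
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * ell ** 2))
approx = phi(x) @ phi(y)
```

Making the kernel explicit in this way is what renders the latent-variable posterior amenable to standard MCMC.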
arXiv Detail & Related papers (2023-06-14T08:42:10Z)
- Bayesian Inference for the Multinomial Probit Model under Gaussian Prior Distribution [0.0]
Multinomial probit (MNP) models are fundamental and widely applied regression models for categorical data.
Fasano and Durante (2022) proved that the class of unified skew-normal distributions is conjugate to several MNP sampling models.
We adapt the results for a popular special case: the discrete-choice MNP model under zero-mean and independent Gaussian priors.
arXiv Detail & Related papers (2022-06-01T19:10:41Z)
- Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over the distributions' properties, such as parameters, symmetry and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z)
- A method to integrate and classify normal distributions [0.0]
We present results and open-source software that compute the probability of a normal distribution with any parameters, in any number of dimensions, over any domain.
We demonstrate these tools with vision research applications of detecting occluding objects in natural scenes, and detecting camouflage.
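For the common special case of an orthant-shaped domain, this kind of computation is also available off the shelf; a minimal sketch using SciPy's multivariate normal CDF (the mean, covariance, and evaluation point are illustrative, and this is not the paper's own software):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Probability that a correlated bivariate normal falls in the
# lower-orthant domain {x1 <= 1, x2 <= 1}.
mean = np.zeros(2)
cov = np.array([[1.0, 0.3],
                [0.3, 1.0]])
p = multivariate_normal(mean, cov).cdf(np.array([1.0, 1.0]))
```

General (non-rectangular) domains are what require the specialized quadrature the paper describes.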
arXiv Detail & Related papers (2020-12-23T05:45:41Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
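The pathwise interpretation of conditioning rests on Matheron's update rule: a prior sample is corrected by the data residual to yield a posterior sample, without re-factorizing a joint covariance per draw. A minimal exact-GP sketch (the paper's contribution concerns efficient approximations of this rule; the kernel, data, and seed here are illustrative):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential kernel between two 1-D input arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(1)
X = np.array([-1.0, 0.0, 1.0])             # observed inputs
y = np.sin(X)                              # noiseless observations
Xs = np.array([-1.5, -0.5, 0.5, 1.5])      # test inputs

# Draw one joint prior sample f ~ GP(0, k) over [X, Xs].
Z = np.concatenate([X, Xs])
K = rbf(Z, Z) + 1e-9 * np.eye(len(Z))
f = np.linalg.cholesky(K) @ rng.standard_normal(len(Z))
fX, fXs = f[:len(X)], f[len(X):]

# Matheron's rule: shift the prior path by the kernel-weighted data
# residual to obtain a posterior sample at the test inputs.
Kxx = rbf(X, X) + 1e-9 * np.eye(len(X))
post = fXs + rbf(Xs, X) @ np.linalg.solve(Kxx, y - fX)
```

Replacing the exact prior draw with a cheap approximate one (e.g. random Fourier features) is what removes the cubic scaling mentioned above.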
This list is automatically generated from the titles and abstracts of the papers on this site.