Exponential Negation of a Probability Distribution
- URL: http://arxiv.org/abs/2010.11533v2
- Date: Thu, 1 Apr 2021 14:05:56 GMT
- Title: Exponential Negation of a Probability Distribution
- Authors: Qinyuan Wu, Yong Deng and Neal Xiong
- Abstract summary: The proposed negation can be seen as a kind of geometric negation.
The number of iterations needed for convergence is inversely proportional to the number of elements in the distribution.
- Score: 7.895866278697778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The negation operation is important in intelligent information
processing. Unlike existing arithmetic negations, an exponential negation is
presented in this paper. The new negation can be seen as a kind of geometric
negation. Some basic properties of the proposed negation are investigated; we
find that its fixed point is the uniform probability distribution. The negation
is an entropy-increasing operation, and all probability distributions
converge to the uniform distribution after repeated negation iterations. The
number of iterations needed for convergence is inversely proportional to the
number of elements in the distribution. Some numerical examples illustrate
the efficiency of the proposed negation.
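The abstract's claims (uniform fixed point, entropy increase, convergence under iteration) can be sketched in a few lines. The exact functional form used below, p_i → e^{-p_i} / Σ_j e^{-p_j}, is an assumption inferred from the paper's title and its "geometric negation" characterization; consult the paper itself for the precise definition.

```python
import math

def exp_negation(p):
    # Assumed exponential negation: map each p_i to exp(-p_i),
    # then renormalize so the result is again a distribution.
    w = [math.exp(-x) for x in p]
    s = sum(w)
    return [x / s for x in w]

def entropy(p):
    # Shannon entropy in nats; maximized (log n) by the uniform distribution.
    return -sum(x * math.log(x) for x in p if x > 0)

# Iterating the negation drives any distribution toward the uniform
# fixed point, and the entropy grows toward log(n) along the way.
p = [0.7, 0.2, 0.1]
for _ in range(10):
    p = exp_negation(p)
print([round(x, 5) for x in p])  # close to [1/3, 1/3, 1/3]
```

Note that the uniform distribution is indeed a fixed point: when all p_i are equal, all weights e^{-p_i} are equal and renormalization returns the same distribution.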
Related papers
- Negative Probability [0.6906005491572398]
Negative probabilities arise primarily in quantum theory and computing.
Negative probabilities arise as mixing distributions of unobserved latent variables in Bayesian modeling.
Examples of dual densities with negative mixing measures are provided.
arXiv Detail & Related papers (2024-05-05T20:09:49Z) - The negation of permutation mass function [3.1981440103815717]
Existing negation methods are mainly applied in probability theory, evidence theory and complex evidence theory.
How to apply the concept of negation to random permutation sets theory has not been studied.
arXiv Detail & Related papers (2024-03-11T07:44:59Z) - Inference via Interpolation: Contrastive Representations Provably Enable Planning and Inference [110.47649327040392]
Given time series data, how can we answer questions like "what will happen in the future?" and "how did we get here?"
We show how these questions can have compact, closed form solutions in terms of learned representations.
arXiv Detail & Related papers (2024-03-06T22:27:30Z) - Simulating counterfactuals [1.3654846342364302]
Counterfactual inference considers a hypothetical intervention in a parallel world that shares some evidence with the factual world.
We present an algorithm for simulating values from a counterfactual distribution where conditions can be set on both discrete and continuous variables.
arXiv Detail & Related papers (2023-06-27T09:34:32Z) - Debiased Contrastive Learning of Unsupervised Sentence Representations [88.58117410398759]
Contrastive learning is effective in improving pre-trained language models (PLMs) to derive high-quality sentence representations.
Previous works mostly adopt in-batch negatives or sample from training data at random.
We present a new framework, DCLR, to alleviate the influence of these improper negatives.
arXiv Detail & Related papers (2022-05-02T05:07:43Z) - Equivariance Discovery by Learned Parameter-Sharing [153.41877129746223]
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes.
Also, we theoretically analyze the method for Gaussian data and provide a bound on the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z) - Investigating the Role of Negatives in Contrastive Representation Learning [59.30700308648194]
Noise contrastive learning is a popular technique for unsupervised representation learning.
We focus on disambiguating the role of one of these parameters: the number of negative examples.
We find that the results broadly agree with our theory, while our vision experiments are murkier with performance sometimes even being insensitive to the number of negatives.
arXiv Detail & Related papers (2021-06-18T06:44:16Z) - Loss function based second-order Jensen inequality and its application to particle variational inference [112.58907653042317]
Particle variational inference (PVI) uses an ensemble of models as an empirical approximation for the posterior distribution.
PVI iteratively updates each model with a repulsion force to ensure the diversity of the optimized models.
We derive a novel generalization error bound and show that it can be reduced by enhancing the diversity of models.
arXiv Detail & Related papers (2021-06-09T12:13:51Z) - Contracting and Involutive Negations of Probability Distributions [0.0]
We show that the Yager negator plays a crucial role in the definition of pd-independent linear negators.
We introduce an involutive negator in the class of pd-dependent negators.
arXiv Detail & Related papers (2021-03-30T08:58:08Z) - Generating Negations of Probability Distributions [0.0]
We consider negations of probability distributions as point-by-point transformations of pd.
We give a characterization of linear negators as a convex combination of Yager and uniform negators.
arXiv Detail & Related papers (2021-03-27T20:24:10Z) - Contextuality scenarios arising from networks of stochastic processes [68.8204255655161]
An empirical model is said to be contextual if its distributions cannot be obtained by marginalizing a joint distribution over X.
We present a different and classical source of contextual empirical models: the interaction among many processes.
The statistical behavior of the network in the long run makes the empirical model generically contextual and even strongly contextual.
arXiv Detail & Related papers (2020-06-22T16:57:52Z)
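The two negation papers above (Contracting and Involutive Negations, Generating Negations) both build on the Yager negator, p_i → (1 - p_i)/(n - 1), and the uniform negator, p_i → 1/n, with linear negators characterized as convex combinations of the two. A minimal sketch of that characterization (the mixing parameter alpha and the function names are illustrative, not taken from the papers):

```python
def yager_negation(p):
    # Yager negator: p_i -> (1 - p_i) / (n - 1).
    # The result sums to 1 for any distribution with n >= 2 elements.
    n = len(p)
    return [(1 - x) / (n - 1) for x in p]

def linear_negation(p, alpha):
    # Convex combination of the Yager negator (weight alpha) and the
    # uniform negator p_i -> 1/n (weight 1 - alpha), matching the
    # characterization of linear negators quoted above.
    n = len(p)
    return [alpha * (1 - x) / (n - 1) + (1 - alpha) / n for x in p]

p = [0.5, 0.3, 0.2]
print(yager_negation(p))  # the least likely outcome becomes the most likely
```

With alpha = 1 this reduces to the pure Yager negator, and with alpha = 0 every input collapses to the uniform distribution in a single step.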
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.