Belief Evolution Network: Probability Transformation of Basic Belief
Assignment and Fusion Conflict Probability
- URL: http://arxiv.org/abs/2110.03468v1
- Date: Thu, 7 Oct 2021 13:48:36 GMT
- Title: Belief Evolution Network: Probability Transformation of Basic Belief
Assignment and Fusion Conflict Probability
- Authors: Qianli Zhou, Yusheng Huang, Yong Deng
- Abstract summary: We give a new interpretation of the transformation of a basic belief assignment (BBA) into a probability distribution.
We also use a directed acyclic network, called a belief evolution network, to describe the causality between the focal elements of a BBA.
- Score: 4.286327408435937
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We give a new interpretation of the transformation of a basic belief
assignment (BBA) into a probability distribution, and use a directed acyclic
network, called a belief evolution network, to describe the causality between
the focal elements of a BBA. On this basis, a new probability transformation
method called the full causality probability transformation is proposed, and
verification of both the process and the result shows that it is superior to
all previous methods. In addition, by combining this method with the
disjunctive combination rule, we propose a new probabilistic combination rule
called the disjunctive transformation combination rule. It has an excellent
ability to merge conflicts and exhibits an interesting pseudo-Matthew effect,
which offers a new idea for information fusion besides Dempster's combination
rule.
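For readers unfamiliar with the evidence-theory background the abstract assumes, the sketch below illustrates, in standard Dempster-Shafer terms, a basic belief assignment over a frame of discernment, the classical pignistic probability transformation (shown only as the well-known baseline, not the paper's full causality probability transformation), and the conjunctive (Dempster) and disjunctive combination rules mentioned above. The frame, the example mass values, and the function names are illustrative assumptions, not taken from the paper.

```python
# Illustrative Dempster-Shafer background for the abstract above.
# The frame, the two example BBAs, and all names are assumptions made for
# this sketch; the pignistic transformation is the classical baseline, not
# the paper's full causality probability transformation.

FRAME = frozenset({"a", "b", "c"})  # frame of discernment

# Basic belief assignments (BBAs): mass on focal elements, summing to 1.
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.3, FRAME: 0.1}
m2 = {frozenset({"b"}): 0.7, frozenset({"b", "c"}): 0.2, FRAME: 0.1}


def pignistic(m):
    """Classical pignistic transformation: BetP(x) = sum_{A: x in A} m(A)/|A|."""
    betp = {x: 0.0 for x in FRAME}
    for focal, mass in m.items():
        for x in focal:
            betp[x] += mass / len(focal)
    return betp


def dempster(ma, mb):
    """Dempster's (normalised conjunctive) rule; returns fused BBA and conflict K."""
    fused, conflict = {}, 0.0
    for b, wb in ma.items():
        for c, wc in mb.items():
            inter = b & c
            if inter:
                fused[inter] = fused.get(inter, 0.0) + wb * wc
            else:
                conflict += wb * wc  # mass that falls on the empty set
    return {a: v / (1.0 - conflict) for a, v in fused.items()}, conflict


def disjunctive(ma, mb):
    """Disjunctive rule: mass flows to unions, so no conflict mass is produced."""
    fused = {}
    for b, wb in ma.items():
        for c, wc in mb.items():
            union = b | c
            fused[union] = fused.get(union, 0.0) + wb * wc
    return fused


if __name__ == "__main__":
    fused, k = dempster(m1, m2)
    print("conflict K:", round(k, 3))
    print("Dempster:", {tuple(sorted(a)): round(v, 3) for a, v in fused.items()})
    print("disjunctive:", {tuple(sorted(a)): round(v, 3)
                           for a, v in disjunctive(m1, m2).items()})
    print("BetP(m1):", {x: round(p, 3) for x, p in pignistic(m1).items()})
```

On this toy example the two sources conflict heavily (K = 0.54), which is the regime where the abstract contrasts Dempster's normalised rule with disjunctive-style fusion.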
Related papers
- Probabilistic Coherence Transformation [0.0]
We investigate the probabilistic coherence transformation under strictly incoherent operations.
It is found that a large coherence gain can be realized at the price of a loss in success probability.
As an application, it is shown that the conversion from coherence into entanglement may benefit from probabilistic coherence transformation.
arXiv Detail & Related papers (2024-10-10T16:37:46Z)
- Isopignistic Canonical Decomposition via Belief Evolution Network [12.459136964317942]
We propose an isopignistic transformation based on the belief evolution network.
This decomposition offers a reverse path between the possibility distribution and its isopignistic mass functions.
This paper establishes a theoretical basis for building general models of artificial intelligence based on probability theory, Dempster-Shafer theory, and possibility theory.
arXiv Detail & Related papers (2024-05-04T12:39:15Z)
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
arXiv Detail & Related papers (2024-02-13T09:40:19Z)
- Semantic Equivariant Mixup [54.734054770032934]
Mixup is a well-established data augmentation technique that can extend the training distribution and regularize neural networks.
Previous mixup variants tend to over-focus on the label-related information.
We propose a semantic equivariant mixup (sem) to preserve richer semantic information in the input.
arXiv Detail & Related papers (2023-08-12T03:05:53Z)
- A Rigorous Link between Deep Ensembles and (Variational) Bayesian Methods [14.845158804951552]
We establish the first mathematically rigorous link between Bayesian, variational Bayesian, and ensemble methods.
On a technical level, our contribution amounts to a generalised variational inference through the lens of Wasserstein flows.
arXiv Detail & Related papers (2023-05-24T11:13:59Z)
- Federated Learning as Variational Inference: A Scalable Expectation Propagation Approach [66.9033666087719]
This paper extends the inference view and describes a variational inference formulation of federated learning.
We apply FedEP on standard federated learning benchmarks and find that it outperforms strong baselines in terms of both convergence speed and accuracy.
arXiv Detail & Related papers (2023-02-08T17:58:11Z)
- Principled Paraphrase Generation with Parallel Corpora [52.78059089341062]
We formalize the implicit similarity function induced by round-trip Machine Translation.
We show that it is susceptible to non-paraphrase pairs sharing a single ambiguous translation.
We design an alternative similarity metric that mitigates this issue.
arXiv Detail & Related papers (2022-05-24T17:22:42Z)
- The intersection probability: betting with probability intervals [7.655239948659381]
We propose the use of the intersection probability, a transform derived originally for belief functions in the framework of the geometric approach to uncertainty.
We outline a possible decision making framework for probability intervals, analogous to the Transferable Belief Model for belief functions.
arXiv Detail & Related papers (2022-01-05T17:35:06Z)
- Optimal Change-Point Detection with Training Sequences in the Large and Moderate Deviations Regimes [72.68201611113673]
This paper investigates a novel offline change-point detection problem from an information-theoretic perspective.
We assume that the underlying pre- and post-change distributions are unknown and can only be learned from the available training sequences.
arXiv Detail & Related papers (2020-03-13T23:39:40Z)
- A generalization of the symmetrical and optimal probability-to-possibility transformations [0.0]
This paper studies the advantages and shortcomings of two well-known discrete probability-to-possibility transformations.
It generalizes them and alleviates their shortcomings, showing great potential for practical application.
The paper also introduces a novel fuzzy measure of specificity for probability distributions based on the concept of fuzzy subsethood.
arXiv Detail & Related papers (2019-12-29T17:43:45Z)
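As background for the last entry above, the sketch below implements the two classical discrete probability-to-possibility transformations commonly referred to as the symmetric and the optimal transformations and usually attributed to Dubois and Prade; the formulas are given as generally understood in that literature, the example distribution is an assumption, and the generalization proposed in the paper itself is not reproduced here.

```python
# Two classical discrete probability-to-possibility transformations, shown
# only as background; the example distribution is an illustrative assumption.

def optimal_transformation(p):
    """'Optimal' transformation: pi(x) = sum of p(y) over all y with p(y) <= p(x)."""
    return {x: sum(q for q in p.values() if q <= px) for x, px in p.items()}


def symmetric_transformation(p):
    """'Symmetric' transformation: pi(x) = sum over y of min(p(x), p(y))."""
    return {x: sum(min(px, q) for q in p.values()) for x, px in p.items()}


if __name__ == "__main__":
    p = {"x1": 0.5, "x2": 0.3, "x3": 0.15, "x4": 0.05}
    print("optimal:  ", optimal_transformation(p))
    print("symmetric:", symmetric_transformation(p))
```

On this example both outputs dominate the input probabilities and reach a maximum of 1, the usual consistency and normalisation checks for a possibility distribution.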
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.