Belief Revision from Probability
- URL: http://arxiv.org/abs/2307.05632v1
- Date: Tue, 11 Jul 2023 07:11:30 GMT
- Title: Belief Revision from Probability
- Authors: Jeremy Goodman (University of Southern California), Bernhard Salow (University of Oxford)
- Abstract summary: We develop a question-relative, probabilistic account of belief.
We show that the principles it validates are much weaker than those of orthodox theories of belief revision like AGM.
We conclude by arguing that the present framework compares favorably to the rival probabilistic accounts of belief developed by Leitgeb and by Lin and Kelly.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In previous work ("Knowledge from Probability", TARK 2021) we develop a
question-relative, probabilistic account of belief. On this account, what
someone believes relative to a given question is (i) closed under entailment,
(ii) sufficiently probable given their evidence, and (iii) sensitive to the
relative probabilities of the answers to the question. Here we explore the
implications of this account for the dynamics of belief. We show that the
principles it validates are much weaker than those of orthodox theories of
belief revision like AGM, but still stronger than those valid according to the
popular Lockean theory of belief, which equates belief with high subjective
probability. We then consider a restricted class of models, suitable for many
but not all applications, and identify some further natural principles valid on
this class. We conclude by arguing that the present framework compares
favorably to the rival probabilistic accounts of belief developed by Leitgeb
and by Lin and Kelly.
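As a point of contrast for the abstract above: the Lockean theory it mentions equates belief with subjective probability above a threshold. A standard observation (the lottery paradox, not a claim from this paper) is that such beliefs are not closed under conjunction, which is one reason the Lockean theory validates weaker principles than closure-respecting accounts. A minimal sketch, with an assumed threshold of 0.9:

```python
# Illustrative only: Lockean belief as "probability at or above a threshold".
# The threshold value 0.9 and the 10-ticket lottery are assumptions for the
# example, not details from the paper.

def lockean_believes(prob: float, threshold: float = 0.9) -> bool:
    """Lockean account: believe p iff Pr(p) >= threshold."""
    return prob >= threshold

n = 10                              # fair lottery with 10 tickets
pr_ticket_i_loses = 1 - 1 / n       # Pr("ticket i loses") = 0.9 for each i
pr_all_lose = 0.0                   # some ticket must win, so Pr = 0

# Each individual proposition clears the threshold...
believes_each = all(lockean_believes(pr_ticket_i_loses) for _ in range(n))
# ...but their conjunction ("every ticket loses") does not.
believes_conjunction = lockean_believes(pr_all_lose)

print(believes_each)         # True
print(believes_conjunction)  # False
```

The paper's own question-relative account adds closure under entailment (condition (i) of the abstract), which is exactly what this threshold-only picture lacks.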
Related papers
- Machine Learning of the Prime Distribution [49.84018914962972]
We provide a theoretical argument explaining the experimental observations of Yang-Hui He about the learnability of primes.
We also posit that the Erdős–Kac law would be very unlikely to be discovered by current machine learning techniques.
arXiv Detail & Related papers (2024-03-19T09:47:54Z)
- Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results conducted on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z)
- A Belief Model for Conflicting and Uncertain Evidence -- Connecting Dempster-Shafer Theory and the Topology of Evidence [8.295493796476766]
We propose a new model for measuring degrees of beliefs based on possibly inconsistent, incomplete, and uncertain evidence.
We show that computing degrees of belief with this model is #P-complete in general.
arXiv Detail & Related papers (2023-06-06T09:30:48Z)
- Combining Predictions under Uncertainty: The Case of Random Decision Trees [2.322689362836168]
A common approach to aggregate classification estimates in an ensemble of decision trees is to either use voting or to average the probabilities for each class.
In this paper, we investigate a number of alternative prediction methods.
Our methods are inspired by the theories of probability, belief functions and reliable classification.
arXiv Detail & Related papers (2022-08-15T18:36:57Z)
- The intersection probability: betting with probability intervals [7.655239948659381]
We propose the use of the intersection probability, a transform derived originally for belief functions in the framework of the geometric approach to uncertainty.
We outline a possible decision making framework for probability intervals, analogous to the Transferable Belief Model for belief functions.
arXiv Detail & Related papers (2022-01-05T17:35:06Z)
- Bayesianism, Conditional Probability and Laplace Law of Succession in Quantum Mechanics [0.0]
We show that as with the classical probability, all these issues can be resolved affirmatively in the quantum probability.
This implies that the relation between the Bayesian probability and the relative frequency in quantum mechanics is the same as that in the classical probability theory.
arXiv Detail & Related papers (2021-12-16T04:55:10Z)
- Knowledge from Probability [0.0]
We investigate predictions concerning knowledge about the future, about laws of nature, and about the values of inexactly measured quantities.
The analysis combines a theory of knowledge and belief formulated in terms of relations of comparative normality with a probabilistic reduction of those relations.
It predicts that only highly probable propositions are believed, and that many widely held principles of belief-revision fail.
arXiv Detail & Related papers (2021-06-22T02:46:22Z)
- Don't Just Blame Over-parametrization for Over-confidence: Theoretical Analysis of Calibration in Binary Classification [58.03725169462616]
We show theoretically that over-parametrization is not the only reason for over-confidence.
We prove that logistic regression is inherently over-confident, in the realizable, under-parametrized setting.
Perhaps surprisingly, we also show that over-confidence is not always the case.
arXiv Detail & Related papers (2021-02-15T21:38:09Z)
- On Focal Loss for Class-Posterior Probability Estimation: A Theoretical Perspective [83.19406301934245]
We first prove that the focal loss is classification-calibrated, i.e., its minimizer surely yields the Bayes-optimal classifier.
We then prove that the focal loss is not strictly proper, i.e., the confidence score of the classifier does not match the true class-posterior probability.
Our proposed transformation significantly improves the accuracy of class-posterior probability estimation.
arXiv Detail & Related papers (2020-11-18T09:36:52Z)
- Causal Expectation-Maximisation [70.45873402967297]
We show that causal inference is NP-hard even in models characterised by polytree-shaped graphs.
We introduce the causal EM algorithm to reconstruct the uncertainty about the latent variables from data about categorical manifest variables.
We argue that there appears to be an unnoticed limitation to the trending idea that counterfactual bounds can often be computed without knowledge of the structural equations.
arXiv Detail & Related papers (2020-11-04T10:25:13Z) - A Weaker Faithfulness Assumption based on Triple Interactions [89.59955143854556]
We propose a weaker assumption that we call $2$-adjacency faithfulness.
We propose a sound orientation rule for causal discovery that applies under weaker assumptions.
arXiv Detail & Related papers (2020-10-27T13:04:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.