Statistical inference as Green's functions
- URL: http://arxiv.org/abs/2205.11366v1
- Date: Mon, 23 May 2022 14:51:32 GMT
- Title: Statistical inference as Green's functions
- Authors: Hyun Keun Lee, Chulan Kwon, and Yong Woon Kim
- Abstract summary: We show that statistical inference admits a rigorous scientific description for long sequences of exchangeable binary random variables.
Our finding answers a normative and foundational issue in science, and its significance will be far-reaching across pure and applied fields.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Statistical inference from data is a foundational task in science. Recently, it
has received growing attention for its central role in the inference systems of primary
interest in data science, artificial intelligence, and machine learning.
However, the understanding of statistical inference itself remains unsettled: it is
often regarded as a matter of subjective choice or implemented in obscure ways.
Here we show that statistical inference admits a rigorous scientific description for
long sequences of exchangeable binary random variables, the prototypical
stochasticity in theories and applications. A linear differential equation is
derived from exchangeability, and it turns out that statistical inference
is given by the Green's functions. Our finding answers a normative
and foundational issue in science, and its significance will be far-reaching across
pure and applied fields.
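The abstract's setting, inference for long exchangeable binary sequences, can be illustrated by the standard de Finetti route rather than the paper's Green's-function construction. A minimal sketch, assuming a uniform Beta(1,1) mixing measure (an illustrative choice, not taken from the paper):

```python
from fractions import Fraction

def predictive_prob(sequence):
    """Posterior predictive P(next = 1 | sequence) for an exchangeable
    binary sequence under a uniform Beta(1,1) de Finetti prior:
    Laplace's rule of succession, (k + 1) / (n + 2)."""
    n = len(sequence)
    k = sum(sequence)
    return Fraction(k + 1, n + 2)

# Exchangeability: the predictive depends only on the count of 1s,
# not on the order in which they appeared.
assert predictive_prob([1, 0, 1]) == predictive_prob([0, 1, 1]) == Fraction(3, 5)
```

Under exchangeability, any inference rule can depend on the data only through the number of successes, which is why the whole problem reduces to a one-parameter family of predictive weights.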
Related papers
- An Overview of Causal Inference using Kernel Embeddings [14.298666697532838]
Kernel embeddings have emerged as a powerful tool for representing probability measures in a variety of statistical inference problems.
Main challenges include identifying causal associations and estimating the average treatment effect from observational data.
arXiv Detail & Related papers (2024-10-30T07:23:34Z) - Estimation of mutual information via quantum kernel method [0.0]
Estimating mutual information (MI) plays a critical role in investigating relationships among multiple random variables with nonlinear correlations.
We propose a method for estimating mutual information using the quantum kernel.
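For context on what such estimators compute, here is the classical plug-in estimate of MI for discrete samples (a baseline sketch, not the quantum kernel method this paper proposes):

```python
from math import log2
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples:
    sum over observed pairs of p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    return sum((c / n) * log2((c * n) / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Perfectly dependent binary variables carry 1 bit of mutual information.
```

Kernel-based estimators target the same quantity but avoid explicit histograms, which is what makes them attractive for continuous or high-dimensional variables.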
arXiv Detail & Related papers (2023-10-19T00:53:16Z) - A Causal Framework for Decomposing Spurious Variations [68.12191782657437]
We develop tools for decomposing spurious variations in Markovian and Semi-Markovian models.
We prove the first results that allow a non-parametric decomposition of spurious effects.
The described approach has several applications, ranging from explainable and fair AI to questions in epidemiology and medicine.
arXiv Detail & Related papers (2023-06-08T09:40:28Z) - On the Joint Interaction of Models, Data, and Features [82.60073661644435]
We introduce a new tool, the interaction tensor, for empirically analyzing the interaction between data and model through features.
Based on these observations, we propose a conceptual framework for feature learning.
Under this framework, the expected accuracy for a single hypothesis and agreement for a pair of hypotheses can both be derived in closed-form.
arXiv Detail & Related papers (2023-06-07T21:35:26Z) - Inferential Moments of Uncertain Multivariable Systems [0.0]
We treat Bayesian probability updating as a random process and uncover intrinsic quantitative features of joint probability distributions called inferential moments.
Inferential moments quantify shape information about how a prior distribution is expected to update in response to yet to be obtained information.
We find a power series expansion of the mutual information in terms of inferential moments, which implies a connection between inferential theoretic logic and elements of information theory.
arXiv Detail & Related papers (2023-05-03T00:56:12Z) - Prediction-Powered Inference [68.97619568620709]
Prediction-powered inference is a framework for performing valid statistical inference when an experimental dataset is supplemented with predictions from a machine-learning system.
The framework yields simple algorithms for computing provably valid confidence intervals for quantities such as means, quantiles, and linear and logistic regression coefficients.
Prediction-powered inference could enable researchers to draw valid and more data-efficient conclusions using machine learning.
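The point estimate behind this framework can be sketched in a few lines (confidence-interval construction, the paper's main contribution, is omitted; the function name and inputs here are illustrative):

```python
import statistics

def ppi_mean(y_labeled, yhat_labeled, yhat_unlabeled):
    """Prediction-powered point estimate of a mean: the average model
    prediction on the large unlabeled set, corrected by the average
    prediction error (the "rectifier") measured on the small labeled set."""
    rectifier = statistics.fmean(p - y for p, y in zip(yhat_labeled, y_labeled))
    return statistics.fmean(yhat_unlabeled) - rectifier

# If the model over-predicts by a constant +0.5 everywhere, the rectifier
# estimated on the labeled pairs removes that bias from the unlabeled mean.
```

The labeled set debiases the predictions, so validity does not rest on the machine-learning model being accurate, only on the labeled sample being representative.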
arXiv Detail & Related papers (2023-01-23T18:59:28Z) - BayesIMP: Uncertainty Quantification for Causal Data Fusion [52.184885680729224]
We study the causal data fusion problem, where datasets pertaining to multiple causal graphs are combined to estimate the average treatment effect of a target variable.
We introduce a framework which combines ideas from probabilistic integration and kernel mean embeddings to represent interventional distributions in the reproducing kernel Hilbert space.
arXiv Detail & Related papers (2021-06-07T10:14:18Z) - Counterfactual Invariance to Spurious Correlations: Why and How to Pass Stress Tests [87.60900567941428]
A 'spurious correlation' is the dependence of a model on some aspect of the input data that an analyst thinks shouldn't matter.
In machine learning, these have a know-it-when-you-see-it character.
We study stress testing using the tools of causal inference.
arXiv Detail & Related papers (2021-05-31T14:39:38Z) - A Philosophy of Data [91.3755431537592]
We work from the fundamental properties necessary for statistical computation to a definition of statistical data.
We argue that the need for useful data to be commensurable rules out an understanding of properties as fundamentally unique or equal.
With our increasing reliance on data and data technologies, these two characteristics of data affect our collective conception of reality.
arXiv Detail & Related papers (2020-04-15T14:47:24Z) - Causal Relational Learning [29.082088734252213]
We propose a declarative language called CaRL for capturing causal background knowledge and assumptions.
CaRL provides a foundation for inferring causality and reasoning about the effect of complex interventions in relational domains.
arXiv Detail & Related papers (2020-04-07T18:33:05Z) - On Geometry of Information Flow for Causal Inference [0.0]
This paper takes the perspective of information flow, which includes the Nobel-prize-winning work on Granger causality.
Our main contribution will be to develop analysis tools that will allow a geometric interpretation of information flow as a causal inference indicated by transfer entropy.
arXiv Detail & Related papers (2020-02-06T02:46:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.