Model Free Prediction with Uncertainty Assessment
- URL: http://arxiv.org/abs/2405.12684v4
- Date: Wed, 31 Jul 2024 07:27:56 GMT
- Title: Model Free Prediction with Uncertainty Assessment
- Authors: Yuling Jiao, Lican Kang, Jin Liu, Heng Peng, Heng Zuo
- Abstract summary: We propose a novel framework that transforms the deep estimation paradigm into a platform conducive to conditional mean estimation.
We develop an end-to-end convergence rate for the conditional diffusion model and establish the asymptotic normality of the generated samples.
Through numerical experiments, we empirically validate the efficacy of our proposed methodology.
- Score: 7.524024486998338
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep nonparametric regression, characterized by the utilization of deep neural networks to learn target functions, has emerged as a focus of research attention in recent years. Despite considerable progress in understanding convergence rates, the absence of asymptotic properties hinders rigorous statistical inference. To address this gap, we propose a novel framework that transforms the deep estimation paradigm into a platform conducive to conditional mean estimation, leveraging the conditional diffusion model. Theoretically, we develop an end-to-end convergence rate for the conditional diffusion model and establish the asymptotic normality of the generated samples. Consequently, we are equipped to construct confidence regions, facilitating robust statistical inference. Furthermore, through numerical experiments, we empirically validate the efficacy of our proposed methodology.
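The workflow the abstract describes can be sketched as follows: draw samples from the conditional diffusion model at a query covariate, average them to estimate the conditional mean, and use a normal approximation to form a confidence interval. The sketch below is illustrative only; `conditional_sampler` is a hypothetical stand-in for the trained conditional diffusion model, and the interval uses the Monte Carlo standard error rather than the paper's theoretical variance.

```python
# Illustrative sketch only: conditional_sampler is a hypothetical stand-in for
# a trained conditional diffusion model that draws Y | X = x.
import numpy as np
from scipy.stats import norm


def conditional_sampler(x, n_samples, rng):
    """Hypothetical stand-in for samples generated by a conditional diffusion model."""
    return np.sin(x) + 0.3 * rng.standard_normal(n_samples)


def conditional_mean_ci(x, n_samples=2000, alpha=0.05, seed=0):
    """Monte Carlo estimate of E[Y | X = x] with a (1 - alpha) normal-approximation CI."""
    rng = np.random.default_rng(seed)
    samples = conditional_sampler(x, n_samples, rng)
    mean = samples.mean()
    se = samples.std(ddof=1) / np.sqrt(n_samples)  # standard error of the sample mean
    z = norm.ppf(1.0 - alpha / 2.0)                # normal quantile, e.g. 1.96 for alpha = 0.05
    return mean, (mean - z * se, mean + z * se)


if __name__ == "__main__":
    est, (lo, hi) = conditional_mean_ci(x=1.0)
    print(f"E[Y | X = 1.0] estimate: {est:.3f}, 95% CI: ({lo:.3f}, {hi:.3f})")
```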
Related papers
- A Likelihood Based Approach to Distribution Regression Using Conditional Deep Generative Models [6.647819824559201]
We study the large-sample properties of a likelihood-based approach for estimating conditional deep generative models.
Our results lead to the convergence rate of a sieve maximum likelihood estimator for estimating the conditional distribution.
arXiv Detail & Related papers (2024-10-02T20:46:21Z) - Zero-Shot Uncertainty Quantification using Diffusion Probabilistic Models [7.136205674624813]
We conduct a study to evaluate the effectiveness of ensemble methods on solving different regression problems using diffusion models.
We demonstrate that ensemble methods consistently improve model prediction accuracy across various regression tasks.
Our study provides a comprehensive view of the utility of diffusion ensembles, serving as a useful reference for practitioners employing diffusion models in regression problem-solving.
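As a generic illustration of the ensembling idea (the paper's exact scheme is not reproduced here), predictions from several independently trained regressors can be averaged, with their spread serving as an uncertainty proxy; `toy_models` below are hypothetical stand-ins for diffusion-based regressors.

```python
# Generic deep-ensemble-style sketch; toy_models are hypothetical stand-ins
# for independently trained diffusion-based regressors.
import numpy as np


def ensemble_predict(models, x):
    """Return the ensemble mean and the across-model spread at inputs x."""
    preds = np.stack([m(x) for m in models])       # shape: (n_models, n_points)
    return preds.mean(axis=0), preds.std(axis=0)   # mean prediction, uncertainty proxy


rng = np.random.default_rng(1)
toy_models = [lambda x, b=rng.normal(0.0, 0.1): np.cos(x) + b for _ in range(5)]
mean, spread = ensemble_predict(toy_models, np.linspace(0.0, 3.0, 4))
print("ensemble mean:", mean)
print("ensemble spread:", spread)
```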
arXiv Detail & Related papers (2024-08-08T18:34:52Z) - Extended Fiducial Inference: Toward an Automated Process of Statistical Inference [9.277340234795801]
We develop a new statistical inference method called extended Fiducial inference (EFI).
The new method achieves the goal of fiducial inference by leveraging advanced statistical computing techniques.
EFI offers significant advantages in parameter estimation and hypothesis testing.
arXiv Detail & Related papers (2024-07-31T14:15:42Z) - Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z) - Semi-Parametric Inference for Doubly Stochastic Spatial Point Processes: An Approximate Penalized Poisson Likelihood Approach [3.085995273374333]
Doubly-stochastic point processes model the occurrence of events over a spatial domain as an inhomogeneous process conditioned on the realization of a random intensity function.
Existing implementations of doubly-stochastic spatial models are computationally demanding, often have limited theoretical guarantees, and/or rely on restrictive assumptions.
arXiv Detail & Related papers (2023-06-11T19:48:39Z) - Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results on multiple datasets offer compelling support for our theoretical assertions.
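One common way to realize counterfactual inference with quantile regression, shown here as a hedged sketch under a monotonicity assumption and not necessarily the paper's exact estimator: read off the quantile level of the observed outcome under the factual condition, then evaluate the counterfactual condition at the same level. Empirical quantiles stand in for a trained neural quantile regressor.

```python
# Hedged sketch of quantile-transfer counterfactual inference under a
# monotonicity assumption; empirical quantiles stand in for a trained
# neural quantile-regression model.
import numpy as np

rng = np.random.default_rng(0)


def outcome(t, u):
    """Toy structural equation: nonlinear in treatment t, monotone in noise u."""
    return np.exp(0.5 * t) + u


u = rng.normal(0.0, 1.0, 10_000)            # latent individual noise
y_factual_pop = outcome(t=0.0, u=u)         # population of outcomes under t = 0
y_counter_pop = outcome(t=1.0, u=u)         # population of outcomes under t = 1

y_obs = outcome(t=0.0, u=0.7)               # one observed individual under t = 0
tau = (y_factual_pop <= y_obs).mean()       # its empirical quantile level under t = 0
y_cf = np.quantile(y_counter_pop, tau)      # same quantile level under t = 1

print(f"tau = {tau:.3f}, counterfactual estimate = {y_cf:.3f}, "
      f"ground truth = {outcome(t=1.0, u=0.7):.3f}")
```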
arXiv Detail & Related papers (2023-06-09T08:30:51Z) - The Implicit Delta Method [61.36121543728134]
In this paper, we propose an alternative, the implicit delta method, which works by infinitesimally regularizing the training loss in order to assess the uncertainty of a downstream evaluation.
We show that the change in the evaluation due to regularization is consistent for the variance of the evaluation estimator, even when the infinitesimal change is approximated by a finite difference.
arXiv Detail & Related papers (2022-11-11T19:34:17Z) - Conditional-Flow NeRF: Accurate 3D Modelling with Reliable Uncertainty Quantification [44.598503284186336]
Conditional-Flow NeRF (CF-NeRF) is a novel probabilistic framework to incorporate uncertainty quantification into NeRF-based approaches.
CF-NeRF learns a distribution over all possible radiance fields modelling the scene, which is used to quantify the uncertainty associated with the modelled scene.
arXiv Detail & Related papers (2022-03-18T23:26:20Z) - Divergence Frontiers for Generative Models: Sample Complexity, Quantization Level, and Frontier Integral [58.434753643798224]
Divergence frontiers have been proposed as an evaluation framework for generative models.
We establish non-asymptotic bounds on the sample complexity of the plug-in estimator of divergence frontiers.
We also augment the divergence frontier framework by investigating the statistical performance of smoothed distribution estimators.
arXiv Detail & Related papers (2021-06-15T06:26:25Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)