Statistical Inference after Kernel Ridge Regression Imputation under Item Nonresponse
- URL: http://arxiv.org/abs/2102.00058v1
- Date: Fri, 29 Jan 2021 20:46:33 GMT
- Title: Statistical Inference after Kernel Ridge Regression Imputation under Item Nonresponse
- Authors: Hengfang Wang, Jae-Kwang Kim
- Abstract summary: We consider a nonparametric approach to imputation using the kernel ridge regression technique and propose consistent variance estimation.
The proposed variance estimator is based on a linearization approach which employs the entropy method to estimate the density ratio.
- Score: 0.76146285961466
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Imputation is a popular technique for handling missing data. We consider a
nonparametric approach to imputation using the kernel ridge regression
technique and propose consistent variance estimation. The proposed variance
estimator is based on a linearization approach which employs the entropy method
to estimate the density ratio. The root-n consistency of the imputation
estimator is established when a Sobolev space is utilized in the kernel ridge
regression imputation, which enables us to develop the proposed variance
estimator. Synthetic data experiments are presented to confirm our theory.
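As a rough illustration of the imputation step described in the abstract, the sketch below fits kernel ridge regression on the respondents and replaces missing outcomes with fitted values before averaging, i.e. the usual imputation estimator of a population mean under missing-at-random item nonresponse. This is a minimal sketch, not the authors' implementation: the synthetic data, scikit-learn's KernelRidge, the RBF kernel, and the penalty alpha are illustrative assumptions, and the proposed linearization-based variance estimator is not shown.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Synthetic data with item nonresponse: y_i is observed only when delta_i = 1,
# and the response probability depends on x only (missing at random).
n = 500
x = rng.uniform(-2.0, 2.0, size=(n, 1))
y = np.sin(np.pi * x[:, 0]) + rng.normal(scale=0.3, size=n)
prob_respond = 1.0 / (1.0 + np.exp(-(0.5 + x[:, 0])))
delta = rng.binomial(1, prob_respond)  # response indicator

# Kernel ridge regression fitted on the respondents only.
# The RBF kernel, alpha, and gamma are illustrative choices, not the paper's tuning.
krr = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0)
krr.fit(x[delta == 1], y[delta == 1])

# Impute missing outcomes with the fitted regression function m_hat(x).
y_imputed = np.where(delta == 1, y, krr.predict(x))

# KRR imputation estimator of theta = E(Y).
theta_hat = y_imputed.mean()
print(f"KRR imputation estimate of E(Y): {theta_hat:.3f}")
```

A consistent variance estimate along the lines of the abstract would then linearize this estimator, which requires both the fitted regression function and an estimate of the density ratio between respondents and nonrespondents; that step relies on the paper's entropy-based density ratio estimator and is omitted from this sketch.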
Related papers
- Progression: an extrapolation principle for regression [0.0]
We propose a novel statistical extrapolation principle.
It assumes a simple relationship between predictors and the response at the boundary of the training predictor samples.
Our semi-parametric method, progression, leverages this extrapolation principle and offers guarantees on the approximation error beyond the training data range.
arXiv Detail & Related papers (2024-10-30T17:29:51Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over methods based on meta-analysis as heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Selective Nonparametric Regression via Testing [54.20569354303575]
We develop an abstention procedure via testing the hypothesis on the value of the conditional variance at a given point.
Unlike existing methods, the proposed one accounts not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor.
arXiv Detail & Related papers (2023-09-28T13:04:11Z)
- Positive definite nonparametric regression using an evolutionary algorithm with application to covariance function estimation [0.0]
We propose a novel nonparametric regression framework for estimating covariance functions of stationary processes.
Our method can impose positive definiteness, as well as isotropy and monotonicity, on the estimators.
Our method provides more reliable estimates for long-range dependence.
arXiv Detail & Related papers (2023-04-25T22:01:14Z)
- Density Estimation with Autoregressive Bayesian Predictives [1.5771347525430772]
In the context of density estimation, the standard Bayesian approach is to target the posterior predictive.
We develop a novel parameterization of the bandwidth using an autoregressive neural network that maps the data into a latent space.
arXiv Detail & Related papers (2022-06-13T20:43:39Z)
- Heavy-tailed Streaming Statistical Estimation [58.70341336199497]
We consider the task of heavy-tailed statistical estimation given streaming $p$-dimensional samples.
We design a clipped gradient descent algorithm and provide an improved analysis under a more nuanced condition on the noise of the gradients.
arXiv Detail & Related papers (2021-08-25T21:30:27Z)
- Statistical inference using Regularized M-estimation in the reproducing kernel Hilbert space for handling missing data [0.76146285961466]
We first use kernel ridge regression to develop imputation for handling item nonresponse.
A nonparametric propensity score estimator using the reproducing kernel Hilbert space is also developed.
The proposed method is applied to analyze air pollution data measured in Beijing, China.
arXiv Detail & Related papers (2021-07-15T14:51:39Z)
- Statistical Inference for High-Dimensional Linear Regression with Blockwise Missing Data [13.48481978963297]
Blockwise missing data occurs when we integrate multisource or multimodality data where different sources or modalities contain complementary information.
We propose a computationally efficient estimator for the regression coefficient vector based on carefully constructed unbiased estimating equations.
Numerical studies and an application to the Alzheimer's Disease Neuroimaging Initiative data show that the proposed method performs better and benefits more from unsupervised samples than existing methods.
arXiv Detail & Related papers (2021-06-07T05:12:42Z)
- Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
arXiv Detail & Related papers (2020-05-20T15:01:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.