Tuning-free ridge estimators for high-dimensional generalized linear models
- URL: http://arxiv.org/abs/2002.11916v1
- Date: Thu, 27 Feb 2020 05:01:42 GMT
- Title: Tuning-free ridge estimators for high-dimensional generalized linear models
- Authors: Shih-Ting Huang, Fang Xie, and Johannes Lederer
- Abstract summary: We show that ridge estimators can be modified such that tuning parameters can be avoided altogether.
We also show that these modified versions can improve on the empirical prediction accuracies of standard ridge estimators combined with cross-validation.
- Score: 3.383670923637875
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Ridge estimators regularize the squared Euclidean lengths of parameters. Such
estimators are mathematically and computationally attractive but involve tuning
parameters that can be difficult to calibrate. In this paper, we show that
ridge estimators can be modified such that tuning parameters can be avoided
altogether. We also show that these modified versions can improve on the
empirical prediction accuracies of standard ridge estimators combined with
cross-validation, and we provide first theoretical guarantees.
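To make the role of the tuning parameter concrete, here is a minimal numpy sketch of the standard ridge estimator in the linear-model case (the paper treats general GLMs); the paper's tuning-free modification is not reproduced here, and all data and values below are illustrative.

```python
import numpy as np

def ridge(X, y, lam):
    """Standard ridge estimator: argmin_b ||y - X b||^2 + lam * ||b||^2,
    with closed form (X'X + lam*I)^{-1} X'y. Here lam is the tuning
    parameter that the paper's modified estimators are designed to avoid."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Toy high-dimensional data (p > n): the fit depends strongly on lam,
# which is why calibrating it (typically by cross-validation) matters.
rng = np.random.default_rng(0)
n, p = 50, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + 0.1 * rng.standard_normal(n)
for lam in (0.1, 1.0, 10.0):
    print(lam, np.linalg.norm(ridge(X, y, lam) - beta))
```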
Related papers
- Parametric Scaling Law of Tuning Bias in Conformal Prediction [11.970092440023956]
We find that the tuning bias - the coverage gap introduced by leveraging the same dataset for tuning and calibration - is negligible for simple parameter tuning in many conformal prediction methods.
We establish a theoretical framework to quantify the tuning bias and provide rigorous proof for the scaling law of the tuning bias by deriving its upper bound.
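For context, a minimal numpy sketch of the split-conformal calibration step whose coverage guarantee is at stake; the tuning bias studied in the paper arises when the score function itself has been tuned on the same calibration data (the scores and data below are illustrative stand-ins).

```python
import numpy as np

def split_conformal_quantile(scores, alpha):
    """Split-conformal calibration step: the ceil((n+1)(1-alpha))-th
    smallest calibration score yields marginal coverage >= 1 - alpha,
    provided the score function was NOT tuned on the same data."""
    n = len(scores)
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    return np.sort(scores)[k - 1]

rng = np.random.default_rng(0)
cal = np.abs(rng.standard_normal(500))    # e.g., |y - f(x)| residual scores
q = split_conformal_quantile(cal, alpha=0.1)
test = np.abs(rng.standard_normal(100000))
print("coverage:", (test <= q).mean())    # close to 0.9
```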
arXiv Detail & Related papers (2025-02-05T09:26:47Z)
- Optimizing Estimators of Squared Calibration Errors in Classification [2.3020018305241337]
We propose a mean-squared error-based risk that enables the comparison and optimization of estimators of squared calibration errors.
Our approach advocates for a training-validation-testing pipeline when estimating a calibration error.
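As a point of reference, a minimal sketch of a standard binned plug-in estimator of a squared calibration error for binary classification; this is not the paper's proposed risk-based estimator, and the binning scheme is exactly the kind of choice the advocated training-validation-testing pipeline would evaluate on held-out data.

```python
import numpy as np

def binned_squared_calibration_error(probs, labels, n_bins=10):
    """Binned plug-in estimate of a squared calibration error for a
    binary classifier: sum over bins of
    (bin weight) * (mean label - mean predicted probability)^2."""
    bins = np.minimum((probs * n_bins).astype(int), n_bins - 1)
    sce = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            gap = labels[mask].mean() - probs[mask].mean()
            sce += mask.mean() * gap ** 2
    return sce

# A perfectly calibrated predictor: labels drawn from the stated probs.
rng = np.random.default_rng(0)
probs = rng.uniform(size=10000)
labels = (rng.uniform(size=10000) < probs).astype(float)
print(binned_squared_calibration_error(probs, labels))  # near 0
```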
arXiv Detail & Related papers (2024-10-09T15:58:06Z)
- Probabilistic Parameter Estimators and Calibration Metrics for Pose Estimation from Image Features [30.85393323542915]
This paper addresses the challenge of probabilistic parameter estimation given measurement uncertainty in real-time.
We present three probabilistic parameter estimators: a least-squares sampling approach, a linear approximation method, and a probabilistic programming estimator.
We demonstrate that the linear approximation estimator can produce sharp and well-calibrated pose predictions significantly faster than the other methods but may yield overconfident predictions in certain scenarios.
arXiv Detail & Related papers (2024-07-23T07:02:01Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
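For orientation, a minimal sketch of the classic 1-nearest-neighbor matching estimator of an average treatment effect, the kind of estimator whose multivariate bias such modifications repair; the setup below is a toy illustration, not the paper's construction.

```python
import numpy as np

def one_nn_matching_ate(X0, y0, X1, y1):
    """Classic 1-nearest-neighbor matching estimate of an average
    treatment effect: impute each unit's missing potential outcome
    with that of its nearest neighbor in the other treatment group."""
    def impute(Xq, Xref, yref):
        d = ((Xq[:, None, :] - Xref[None, :, :]) ** 2).sum(-1)
        return yref[d.argmin(axis=1)]
    te_controls = impute(X0, X1, y1) - y0   # imputed treated outcomes
    te_treated = y1 - impute(X1, X0, y0)    # imputed control outcomes
    return np.concatenate([te_controls, te_treated]).mean()

# Toy check with a constant treatment effect of 1.0.
rng = np.random.default_rng(0)
X0, X1 = rng.standard_normal((300, 2)), rng.standard_normal((300, 2))
y0 = X0.sum(1) + rng.standard_normal(300)
y1 = X1.sum(1) + 1.0 + rng.standard_normal(300)
print(one_nn_matching_ate(X0, y0, X1, y1))  # near 1.0
```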
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- Scaling Exponents Across Parameterizations and Optimizers [94.54718325264218]
We propose a new perspective on parameterization by investigating a key assumption in prior work.
Our empirical investigation includes tens of thousands of models trained with all combinations of three optimizers, four parameterizations, and a range of learning rates and model sizes.
We find that the best learning rate scaling prescription would often have been excluded by the assumptions in prior work.
arXiv Detail & Related papers (2024-07-08T12:32:51Z)
- Calibration by Distribution Matching: Trainable Kernel Calibration Metrics [56.629245030893685]
We introduce kernel-based calibration metrics that unify and generalize popular forms of calibration for both classification and regression.
These metrics admit differentiable sample estimates, making it easy to incorporate a calibration objective into empirical risk minimization.
We provide intuitive mechanisms to tailor calibration metrics to a decision task, and enforce accurate loss estimation and no regret decisions.
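One concrete instance of such a metric (an illustrative stand-in, not necessarily the exact metric introduced in the paper) is the unbiased squared kernel calibration error for binary classifiers; in practice one would implement it in an autodiff framework so it can serve as a differentiable training objective.

```python
import numpy as np

def skce_unbiased(probs, labels, bandwidth=0.1):
    """Unbiased estimate of a squared kernel calibration error for a
    binary classifier: the average over pairs i != j of
    k(p_i, p_j) * (y_i - p_i) * (y_j - p_j), with a Gaussian kernel on
    the predicted probabilities. Zero in expectation under calibration."""
    n = len(probs)
    K = np.exp(-(probs[:, None] - probs[None, :]) ** 2 / (2 * bandwidth ** 2))
    r = labels - probs                        # calibration residuals
    M = K * np.outer(r, r)
    return (M.sum() - np.trace(M)) / (n * (n - 1))

rng = np.random.default_rng(0)
probs = rng.uniform(size=2000)
labels = (rng.uniform(size=2000) < probs).astype(float)
print(skce_unbiased(probs, labels))           # near 0 when calibrated
```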
arXiv Detail & Related papers (2023-10-31T06:19:40Z)
- Sharp Calibrated Gaussian Processes [58.94710279601622]
State-of-the-art approaches for designing calibrated models rely on inflating the Gaussian process posterior variance.
We present a calibration approach that generates predictive quantiles using a computation inspired by the vanilla Gaussian process posterior variance.
Our approach is shown to yield a calibrated model under reasonable assumptions.
arXiv Detail & Related papers (2023-02-23T12:17:36Z)
- Tuned Regularized Estimators for Linear Regression via Covariance Fitting [17.46329281993348]
We consider the problem of finding tuned regularized parameter estimators for linear models.
We show that three known optimal linear estimators belong to a wider class of estimators.
We show that the resulting class of estimators yields tuned versions of known regularized estimators.
arXiv Detail & Related papers (2022-01-21T16:08:08Z)
- Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
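A well-known member of this regularized-regression family is the Stein gradient estimator with Tikhonov regularization; the sketch below (with illustrative bandwidth and regularization values) is not the paper's iterative, curl-free-kernel estimator, only a simple baseline from the same framework.

```python
import numpy as np

def stein_score_estimator(X, bandwidth=1.0, eta=0.1):
    """Stein gradient estimator of the score grad log p(x) at the sample
    points X (n x d), with a Gaussian kernel: solves the Tikhonov-
    regularized kernel system -(K + eta*I)^{-1} <grad, K>."""
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]             # (n, n, d)
    K = np.exp(-(diff ** 2).sum(-1) / (2 * bandwidth ** 2))
    # grad_K[i, :] = sum_j d k(x_i, x_j) / d x_j
    grad_K = (K[:, :, None] * diff).sum(1) / bandwidth ** 2
    return -np.linalg.solve(K + eta * np.eye(n), grad_K)

# Sanity check: for N(0, 1) samples the true score is -x.
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 1))
print(np.corrcoef(stein_score_estimator(x).ravel(), -x.ravel())[0, 1])  # ~1
```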
arXiv Detail & Related papers (2020-05-20T15:01:03Z)
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models [80.22609163316459]
We introduce an unbiased estimator of the log marginal likelihood and its gradients for latent variable models based on randomized truncation of infinite series.
We show that models trained using our estimator give better test-set likelihoods than a standard importance-sampling based approach for the same average computational cost.
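The randomized-truncation idea can be sketched generically: a Russian-roulette estimator is unbiased for an infinite series even though it evaluates only finitely many terms. The toy series and tail probability q below are illustrative; SUMO applies this construction to a series whose partial sums are increasingly tight log-likelihood estimates.

```python
import numpy as np

def russian_roulette(delta, rng, q=0.9):
    """Unbiased single-sample estimate of sum_{k>=0} delta(k) via
    randomized truncation: draw K with P(K >= k) = q**k and reweight
    the k-th term by 1 / P(K >= k)."""
    K = rng.geometric(1 - q) - 1        # support {0, 1, 2, ...}
    return sum(delta(k) / q ** k for k in range(K + 1))

# Sanity check on the geometric series sum_{k>=0} 0.5**k = 2.
rng = np.random.default_rng(0)
estimates = [russian_roulette(lambda k: 0.5 ** k, rng) for _ in range(20000)]
print(np.mean(estimates))               # close to 2.0
```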
arXiv Detail & Related papers (2020-04-01T11:49:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.