A selective review of sufficient dimension reduction for multivariate
response regression
- URL: http://arxiv.org/abs/2202.00876v1
- Date: Wed, 2 Feb 2022 04:53:09 GMT
- Authors: Yuexiao Dong, Abdul-Nasah Soale, Michael D. Power
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We review sufficient dimension reduction (SDR) estimators with multivariate
response in this paper. A wide range of SDR methods are characterized as
inverse regression SDR estimators or forward regression SDR estimators. The
inverse regression family includes pooled marginal estimators, projective
resampling estimators, and distance-based estimators. Ordinary least squares,
partial least squares, and semiparametric SDR estimators, on the other hand,
are discussed as estimators from the forward regression family.
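To make the inverse regression family concrete, here is a minimal sketch of a pooled marginal estimator, assuming sliced inverse regression (SIR) as the marginal method applied to each response coordinate: the function name `pooled_marginal_sir`, the quantile slicing scheme, and all defaults are illustrative assumptions, not the exact estimators reviewed in the paper.

```python
import numpy as np

def pooled_marginal_sir(X, Y, n_slices=5, d=1):
    """Pooled marginal SIR sketch: run SIR on each response coordinate,
    sum the candidate kernel matrices, and return the top-d eigenvectors
    (mapped back to the original predictor scale) as an estimated basis
    of the central subspace."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    # Whiten the predictors: Z = Xc @ Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_half
    M = np.zeros((p, p))
    for j in range(Y.shape[1]):          # one marginal SIR per response
        yj = Y[:, j]
        edges = np.quantile(yj, np.linspace(0.0, 1.0, n_slices + 1))
        slices = np.clip(np.searchsorted(edges, yj, side="right") - 1,
                         0, n_slices - 1)
        for h in range(n_slices):
            idx = slices == h
            if not idx.any():
                continue
            mh = Z[idx].mean(axis=0)     # slice mean of whitened predictors
            M += idx.mean() * np.outer(mh, mh)
    # Leading eigenvectors of the pooled kernel, back on the X scale
    w = np.linalg.eigh(M)[1][:, -d:]
    B = Sigma_inv_half @ w
    return B / np.linalg.norm(B, axis=0)
```

Each response coordinate contributes one marginal SIR kernel; summing the kernels pools information across coordinates, which is the basic idea behind the pooled marginal approach.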
Related papers
- Doubly Robust Regression Discontinuity Designs
We introduce a doubly robust (DR) estimator for regression discontinuity (RD) designs.
Our proposed estimator achieves $\sqrt{n}$-consistency if both regression estimators satisfy certain mild conditions.
arXiv Detail & Related papers (2024-11-12T17:58:34Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- Regression-aware Inference with LLMs
We show that an inference strategy can be sub-optimal for common regression and scoring evaluation metrics.
We propose alternate inference strategies that estimate the Bayes-optimal solution for regression and scoring metrics in closed-form from sampled responses.
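The closed-form idea can be illustrated with a minimal sketch (the function `bayes_optimal_point` and the two metrics below are illustrative assumptions, not the paper's exact estimators): under squared error the Bayes-optimal point prediction is the posterior mean, approximated by the mean of the sampled responses, while under absolute error it is the median.

```python
import statistics

def bayes_optimal_point(samples, metric="squared_error"):
    # Empirical plug-in for the Bayes-optimal decision under the sample
    # distribution: the mean minimizes expected squared error, the median
    # minimizes expected absolute error.
    if metric == "squared_error":
        return statistics.fmean(samples)
    if metric == "absolute_error":
        return statistics.median(samples)
    raise ValueError(f"unknown metric: {metric}")
```

The point is that once numeric responses have been sampled, the metric-optimal prediction is a simple closed-form statistic of the samples rather than the single most likely response.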
arXiv Detail & Related papers (2024-03-07T03:24:34Z)
- Multifidelity Covariance Estimation via Regression on the Manifold of Symmetric Positive Definite Matrices
We show that our manifold regression multifidelity (MRMF) covariance estimator is a maximum likelihood estimator under a certain error model on the manifold.
We demonstrate via numerical examples that the MRMF estimator can provide significant decreases, up to one order of magnitude, in squared estimation error.
arXiv Detail & Related papers (2023-07-23T21:46:55Z)
- Prediction Risk and Estimation Risk of the Ridgeless Least Squares Estimator under General Assumptions on Regression Errors
We explore prediction risk as well as estimation risk under more general regression error assumptions.
Our findings suggest that the benefits of overparameterization can extend to time series, panel, and grouped data.
arXiv Detail & Related papers (2023-05-22T10:04:20Z)
- Sufficient Dimension Reduction for High-Dimensional Regression and Low-Dimensional Embedding: Tutorial and Survey
This is a tutorial and survey paper on various methods for Sufficient Dimension Reduction (SDR).
We cover these methods from both a statistical high-dimensional regression perspective and a machine learning dimensionality-reduction perspective.
arXiv Detail & Related papers (2021-10-18T21:05:08Z)
- Online nonparametric regression with Sobolev kernels
We derive the regret upper bounds on the classes of Sobolev spaces $W^\beta_p(\mathcal{X})$, $p \geq 2$, $\beta > \frac{d}{p}$.
The upper bounds are supported by the minimax regret analysis, which reveals that in the cases $\beta > \frac{d}{2}$ or $p = \infty$ these rates are (essentially) optimal.
arXiv Detail & Related papers (2021-02-06T15:05:14Z)
- Robust Geodesic Regression
We use M-type estimators, including the $L_1$, Huber and Tukey biweight estimators, to perform robust geodesic regression.
Results from numerical examples, including analysis of real neuroimaging data, demonstrate the promising empirical properties of the proposed approach.
arXiv Detail & Related papers (2020-07-09T02:41:32Z)
- Nonparametric Estimation of the Fisher Information and Its Applications
This paper considers the problem of estimation of the Fisher information for location from a random sample of size $n$.
An estimator proposed by Bhattacharya is revisited and improved convergence rates are derived.
A new estimator, termed a clipped estimator, is proposed.
arXiv Detail & Related papers (2020-05-07T17:21:56Z)
- Comment: Entropy Learning for Dynamic Treatment Regimes
JSLZ's approach leverages a rejection-sampling estimate of the value of a given decision rule, based on inverse probability weighting (IPW) and its interpretation as a weighted (or cost-sensitive) classification.
Their use of smooth classification surrogates enables their careful approach to analyzing distributions.
The IPW estimate is problematic, as it leads to weights that discard most of the data and are extremely variable on whatever remains.
arXiv Detail & Related papers (2020-04-06T16:11:05Z)
- Estimating Gradients for Discrete Random Variables by Sampling without Replacement
We derive an unbiased estimator for expectations over discrete random variables based on sampling without replacement.
We show that our estimator can be derived as the Rao-Blackwellization of three different estimators.
arXiv Detail & Related papers (2020-02-14T14:15:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.