Robust Geodesic Regression
- URL: http://arxiv.org/abs/2007.04518v3
- Date: Tue, 25 Jan 2022 06:30:40 GMT
- Title: Robust Geodesic Regression
- Authors: Ha-Young Shin and Hee-Seok Oh
- Abstract summary: We use M-type estimators, including the $L_1$, Huber and Tukey biweight estimators, to perform robust geodesic regression.
Results from numerical examples, including analysis of real neuroimaging data, demonstrate the promising empirical properties of the proposed approach.
- Score: 6.827783641211451
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies robust regression for data on Riemannian manifolds.
Geodesic regression is the generalization of linear regression to a setting
with a manifold-valued dependent variable and one or more real-valued
independent variables. The existing work on geodesic regression uses the
sum-of-squared errors to find the solution, but as in the classical Euclidean
case, the least-squares method is highly sensitive to outliers. In this paper,
we use M-type estimators, including the $L_1$, Huber and Tukey biweight
estimators, to perform robust geodesic regression, and describe how to
calculate the tuning parameters for the latter two. We also show that, on
compact symmetric spaces, all M-type estimators are maximum likelihood
estimators, and argue for the overall superiority of the $L_1$ estimator over
the $L_2$ and Huber estimators on high-dimensional manifolds and over the Tukey
biweight estimator on compact high-dimensional manifolds. Results from
numerical examples, including analysis of real neuroimaging data, demonstrate
the promising empirical properties of the proposed approach.
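For concreteness, the following is a minimal Python sketch of the three M-type loss functions $\rho(r)$ named above, in their classical Euclidean form; the tuning constants are the standard one-dimensional 95%-Gaussian-efficiency values ($c \approx 1.345$ for Huber, $c \approx 4.685$ for Tukey biweight), whereas the paper derives manifold-appropriate tuning parameters that this sketch does not reproduce.

```python
import numpy as np

# In geodesic regression, the residual r would be the geodesic distance
# d(y_i, Exp_p(x_i v)); here we only sketch the loss functions rho(r).

def rho_l1(r):
    """L1 loss: linear growth, so each outlier contributes proportionally."""
    return np.abs(r)

def rho_huber(r, c=1.345):
    """Huber loss: quadratic inside [-c, c], linear outside."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * a**2, c * a - 0.5 * c**2)

def rho_tukey(r, c=4.685):
    """Tukey biweight: bounded loss; residuals beyond c are capped,
    so gross outliers exert no influence on the fit."""
    a = np.minimum(np.abs(r), c)
    return (c**2 / 6.0) * (1.0 - (1.0 - (a / c) ** 2) ** 3)
```

This matches the abstract's comparison: the $L_1$ loss avoids the quadratic growth that makes least squares outlier-sensitive, yet, unlike the Tukey biweight, never discards an observation entirely.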
Related papers
- Deep Fréchet Regression [4.915744683251151]
We propose a flexible regression model capable of handling high-dimensional predictors without imposing parametric assumptions.
The proposed model outperforms existing methods for non-Euclidean responses.
arXiv Detail & Related papers (2024-07-31T07:54:14Z) - Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios in which the response variable, denoted by $Y$, resides in a manifold and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
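As a rough illustration of such a region, here is a split-conformal sketch in Python, assuming geodesic-distance nonconformity scores on the unit sphere; this particular construction and the function names are illustrative assumptions, not the paper's exact prediction set.

```python
import numpy as np

def sphere_dist(a, b):
    """Geodesic (great-circle) distance between unit vectors a and b."""
    return np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))

def conformal_radius(cal_preds, cal_ys, alpha=0.1):
    """Split-conformal radius: the (1 - alpha) empirical quantile, with
    the usual (n + 1) finite-sample correction, of the geodesic scores
    d(mu_hat(x_i), y_i) over a held-out calibration set."""
    scores = np.sort([sphere_dist(p, y) for p, y in zip(cal_preds, cal_ys)])
    n = len(scores)
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    return scores[k - 1]

# The prediction set at a new covariate x is the geodesic ball
# {y : d(mu_hat(x), y) <= radius} around the fitted point mu_hat(x).
```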
arXiv Detail & Related papers (2023-10-12T10:56:25Z) - Multifidelity Covariance Estimation via Regression on the Manifold of Symmetric Positive Definite Matrices [0.42855555838080844]
We show that our manifold regression multifidelity (MRMF) covariance estimator is a maximum likelihood estimator under a certain error model on the manifold.
We demonstrate via numerical examples that the MRMF estimator can provide significant decreases, up to one order of magnitude, in squared estimation error.
arXiv Detail & Related papers (2023-07-23T21:46:55Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Understanding Augmentation-based Self-Supervised Representation Learning
via RKHS Approximation and Regression [53.15502562048627]
Recent work has built the connection between self-supervised learning and the approximation of the top eigenspace of a graph Laplacian operator.
This work delves into a statistical analysis of augmentation-based pretraining.
arXiv Detail & Related papers (2023-06-01T15:18:55Z) - Non-Asymptotic Guarantees for Robust Statistical Learning under
$(1+\varepsilon)$-th Moment Assumption [0.716879432974126]
This paper proposes a log-truncated M-estimator for a large family of statistical regressions.
We show the superiority of log-truncated estimation over standard estimation.
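For intuition, here is a minimal Python sketch of a Catoni-style log-truncation, the device this line of work builds on; the exact truncation used in the paper may differ, so treat the formula below as an illustrative assumption.

```python
import numpy as np

def log_truncated_psi(x):
    """Catoni-style influence function: approximately the identity near
    zero but only logarithmic in the tails, so heavy-tailed losses with
    just a (1 + epsilon)-th moment cannot dominate the estimating equation."""
    return np.where(x >= 0,
                    np.log1p(x + 0.5 * x**2),
                    -np.log1p(-x + 0.5 * x**2))
```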
arXiv Detail & Related papers (2022-01-10T06:22:30Z) - Online nonparametric regression with Sobolev kernels [99.12817345416846]
We derive the regret upper bounds on the classes of Sobolev spaces $W_p^\beta(\mathcal{X})$, $p \geq 2$, $\beta > \frac{d}{p}$.
The upper bounds are supported by the minimax regret analysis, which reveals that in the cases $\beta > \frac{d}{2}$ or $p = \infty$ these rates are (essentially) optimal.
arXiv Detail & Related papers (2021-02-06T15:05:14Z) - $\gamma$-ABC: Outlier-Robust Approximate Bayesian Computation Based on a
Robust Divergence Estimator [95.71091446753414]
We propose to use a nearest-neighbor-based $\gamma$-divergence estimator as a data discrepancy measure.
Our method achieves significantly higher robustness than existing discrepancy measures.
arXiv Detail & Related papers (2020-06-13T06:09:27Z)