Sparse principal component regression via singular value decomposition approach
- URL: http://arxiv.org/abs/2002.09188v1
- Date: Fri, 21 Feb 2020 09:03:05 GMT
- Title: Sparse principal component regression via singular value decomposition approach
- Authors: Shuichi Kawano
- Abstract summary: Principal component regression (PCR) is a two-stage procedure.
Since PCA is performed using only the explanatory variables, the principal components carry no information about the response variable.
We propose a one-stage procedure for PCR based on a singular value decomposition approach.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Principal component regression (PCR) is a two-stage procedure: the first
stage performs principal component analysis (PCA) and the second stage
constructs a regression model whose explanatory variables are replaced by
principal components obtained in the first stage. Since PCA is performed using
only the explanatory variables, the principal components carry no information
about the response variable. To address this problem, we propose a one-stage
procedure for PCR based on a singular value decomposition approach. Our
approach combines two loss functions, a regression loss and a PCA loss, with
sparse regularization. The proposed method yields principal component loadings
that carry information about both the explanatory variables and the response
variable. An estimation algorithm is developed using the alternating direction
method of multipliers (ADMM). We conduct numerical studies to demonstrate the
effectiveness of the proposed method.
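For concreteness, a minimal NumPy sketch of the classic two-stage PCR baseline described in the abstract: PCA on the centered predictors, then ordinary least squares of the response on the top-k component scores. The function name two_stage_pcr and the variables X, y, k are illustrative, not the paper's notation.

    import numpy as np

    def two_stage_pcr(X, y, k):
        """Two-stage PCR: PCA on X alone, then OLS of y on the top-k scores."""
        Xc = X - X.mean(axis=0)                 # stage 1: center, then PCA via SVD
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        V_k = Vt[:k].T                          # top-k loadings, shape (p, k)
        Z = Xc @ V_k                            # principal component scores
        gamma, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)  # stage 2: OLS
        return V_k @ gamma                      # coefficients on the original X scale

Because stage 1 never sees y, the loadings V_k can discard directions of X that vary little yet predict y well; this is the shortcoming the one-stage proposal targets.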
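The proposed one-stage objective couples a regression loss and a PCA reconstruction loss under sparse regularization, solved by ADMM in the paper. The sketch below is a hedged stand-in, not the authors' algorithm: it minimizes a simplified weighted objective w*||y - X V g||^2 + (1 - w)*||X - X V V'||_F^2 + lam*||V||_1 by alternating an exact least-squares update for the regression weights g with a proximal-gradient (soft-thresholding) step on the loadings V. The hyperparameters w, lam, step, and n_iter are assumed tuning knobs.

    import numpy as np

    def soft_threshold(A, tau):
        """Proximal operator of the l1 penalty: elementwise soft-thresholding."""
        return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

    def one_stage_sparse_pcr(X, y, k, w=0.5, lam=0.1, step=1e-4, n_iter=1000):
        """Jointly fit sparse loadings V (p x k) and regression weights g (k,).
        Assumes X and y are centered; all hyperparameters are illustrative."""
        p = X.shape[1]
        rng = np.random.default_rng(0)
        V = rng.normal(scale=0.1, size=(p, k))           # small random init
        for _ in range(n_iter):
            Z = X @ V                                    # component scores
            g, *_ = np.linalg.lstsq(Z, y, rcond=None)    # exact OLS update for g
            r = Z @ g - y                                # regression residual
            E = Z @ V.T - X                              # reconstruction residual
            grad_V = (2 * w * np.outer(X.T @ r, g)                 # regression loss
                      + 2 * (1 - w) * (X.T @ E @ V + E.T @ X @ V)) # PCA loss
            V = soft_threshold(V - step * grad_V, step * lam)  # l1 prox -> sparse V
        return V, g

Because the update for V sees the regression residual r, the loadings are pulled toward directions that predict the response, which is precisely the information the two-stage baseline above discards.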
Related papers
- Principal Component Analysis When n < p: Challenges and Solutions [0.0]
Principal Component Analysis is a key technique for reducing the complexity of high-dimensional data.
Standard principal component analysis performs poorly as a dimensionality reduction technique in high-dimensional scenarios.
We propose a novel estimator called pairwise differences covariance estimation.
arXiv Detail & Related papers (2025-03-21T22:33:52Z) - Adaptive Principal Component Regression with Applications to Panel Data [29.295938927701396]
We provide the first time-uniform finite sample guarantees for (regularized) principal component regression.
Our results rely on adapting tools from modern martingale concentration to the error-in-variables setting.
We show that our method empirically outperforms a baseline which does not leverage error-in-variables regression.
arXiv Detail & Related papers (2023-07-03T21:13:40Z) - Invertible Kernel PCA with Random Fourier Features [0.22940141855172028]
Kernel principal component analysis (kPCA) is a widely studied method to construct a low-dimensional data representation after a nonlinear transformation.
We present an alternative method where the reconstruction follows naturally from the compression step.
We show that ikPCA performs similarly to kPCA with supervised reconstruction on denoising tasks, making it a strong alternative.
arXiv Detail & Related papers (2023-03-09T05:42:10Z) - Vector-Valued Least-Squares Regression under Output Regularity
Assumptions [73.99064151691597]
We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional output.
We derive learning bounds for our method and study the settings in which statistical performance improves over the full-rank method.
arXiv Detail & Related papers (2022-11-16T15:07:00Z) - Local manifold learning and its link to domain-based physics knowledge [53.15471241298841]
In many reacting flow systems, the thermo-chemical state-space is assumed to evolve close to a low-dimensional manifold (LDM).
We show that PCA applied in local clusters of data (local PCA) is capable of detecting the intrinsic parameterization of the thermo-chemical state-space.
arXiv Detail & Related papers (2022-07-01T09:06:25Z) - Dimensionality Reduction and Wasserstein Stability for Kernel Regression [1.3812010983144802]
We study the consequences of the naive two-step procedure in which the dimension of the input variables is first reduced and the reduced input variables are then used to predict the output variable with kernel regression.
In order to analyze the resulting regression errors, a novel stability result for kernel regression with respect to the Wasserstein distance is derived.
arXiv Detail & Related papers (2022-03-17T14:26:28Z) - Poseur: Direct Human Pose Regression with Transformers [119.79232258661995]
We propose a direct, regression-based approach to 2D human pose estimation from single images.
Our framework is end-to-end differentiable, and naturally learns to exploit the dependencies between keypoints.
Ours is the first regression-based approach to perform favorably compared to the best heatmap-based pose estimation methods.
arXiv Detail & Related papers (2022-01-19T04:31:57Z) - Sufficient Dimension Reduction for High-Dimensional Regression and
Low-Dimensional Embedding: Tutorial and Survey [5.967999555890417]
This is a tutorial and survey paper on various methods for Sufficient Dimension Reduction (SDR).
We cover these methods from both a statistical high-dimensional regression perspective and a machine-learning dimensionality-reduction perspective.
arXiv Detail & Related papers (2021-10-18T21:05:08Z) - AgFlow: Fast Model Selection of Penalized PCA via Implicit
Regularization Effects of Gradient Flow [64.81110234990888]
Principal component analysis (PCA) has been widely used as an effective technique for feature extraction and dimension reduction.
In the High Dimension Low Sample Size (HDLSS) setting, one may prefer modified principal components, with penalized loadings.
We propose Approximated Gradient Flow (AgFlow) as a fast model selection method for penalized PCA.
arXiv Detail & Related papers (2021-10-07T08:57:46Z) - TFPose: Direct Human Pose Estimation with Transformers [83.03424247905869]
We formulate the pose estimation task into a sequence prediction problem that can effectively be solved by transformers.
Our framework is simple and direct, bypassing the drawbacks of heatmap-based pose estimation.
Experiments on the MS-COCO and MPII datasets demonstrate that our method can significantly improve the state-of-the-art of regression-based pose estimation.
arXiv Detail & Related papers (2021-03-29T04:18:54Z) - Supervised PCA: A Multiobjective Approach [70.99924195791532]
Methods for supervised principal component analysis (SPCA) incorporate label information into PCA so that the extracted components are useful for prediction as well as for explaining variation in the features.
We propose a new method for SPCA that addresses both of these objectives jointly.
Our approach accommodates arbitrary supervised learning losses and, through a statistical reformulation, provides a novel low-rank extension of generalized linear models.
arXiv Detail & Related papers (2020-11-10T18:46:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.