Quasicyclic Principal Component Analysis
- URL: http://arxiv.org/abs/2502.05297v1
- Date: Fri, 07 Feb 2025 20:00:36 GMT
- Title: Quasicyclic Principal Component Analysis
- Authors: Susanna E. Rumsey, Stark C. Draper, Frank R. Kschischang
- Abstract summary: We present quasicyclic principal component analysis (QPCA), a generalization of principal component analysis (PCA).
QPCA determines an optimized basis for a dataset in terms of families of shift-orthogonal principal vectors.
This is of particular interest when analyzing cyclostationary data, whose cyclic structure is not exploited by the standard PCA algorithm.
- Score: 22.011612303891262
- License:
- Abstract: We present quasicyclic principal component analysis (QPCA), a generalization of principal component analysis (PCA), that determines an optimized basis for a dataset in terms of families of shift-orthogonal principal vectors. This is of particular interest when analyzing cyclostationary data, whose cyclic structure is not exploited by the standard PCA algorithm. We first formulate QPCA as an optimization problem, which we show may be decomposed into a series of PCA problems in the frequency domain. We then formalize our solution as an explicit algorithm and analyze its computational complexity. Finally, we provide some examples of applications of QPCA to cyclostationary signal processing data, including an investigation of carrier pulse recovery, a presentation of methods for estimating an unknown oversampling rate, and a discussion of an appropriate approach for pre-processing data with a non-integer oversampling rate in order to better apply the QPCA algorithm.
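The abstract states that the QPCA optimization decomposes into a series of PCA problems in the frequency domain. The following minimal NumPy sketch illustrates that structure under stated assumptions: each record is treated as B consecutive blocks of an assumed cyclic period T, a DFT is taken across the block index, and an ordinary (complex) PCA is solved independently in each frequency bin. The function name, shapes, and normalization are illustrative and do not reproduce the authors' exact algorithm.

```python
import numpy as np

def qpca_sketch(X, T):
    """Illustrative frequency-domain decomposition in the spirit of QPCA.

    X : (num_samples, B*T) real array; each row is a cyclostationary record
        viewed as B consecutive blocks of assumed period T.
    T : assumed cyclic period (block length).

    Returns a list of B arrays, each holding the principal vectors of one
    frequency bin (columns sorted by decreasing eigenvalue).
    """
    num_samples, n = X.shape
    assert n % T == 0, "record length must be a multiple of the period T"
    B = n // T

    # Reshape each record into B blocks of length T and take a DFT across
    # the block index, decoupling the problem into B frequency bins.
    blocks = X.reshape(num_samples, B, T)
    freq = np.fft.fft(blocks, axis=1)            # (num_samples, B, T), complex

    principal_vectors = []
    for k in range(B):
        Yk = freq[:, k, :]                        # samples for frequency bin k
        Yk = Yk - Yk.mean(axis=0, keepdims=True)
        # Ordinary PCA in this bin: eigendecomposition of the (Hermitian)
        # sample covariance, eigenvectors sorted by decreasing eigenvalue.
        C = (Yk.conj().T @ Yk) / num_samples
        eigvals, eigvecs = np.linalg.eigh(C)
        order = np.argsort(eigvals)[::-1]
        principal_vectors.append(eigvecs[:, order])
    return principal_vectors
```

Under these assumptions, an inverse DFT across the bins would carry the per-bin principal vectors back toward the shift-related time-domain families of principal vectors described in the abstract.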
Related papers
- Learning-Augmented K-Means Clustering Using Dimensional Reduction [1.7243216387069678]
We propose a solution to reduce the dimensionality of the dataset using Principal Component Analysis (PCA).
PCA is well-established in the literature and has become one of the most useful tools for data modeling, compression, and visualization.
arXiv Detail & Related papers (2024-01-06T12:02:33Z)
- Support Recovery in Sparse PCA with Non-Random Missing Data [27.3669650952144]
We analyze a practical algorithm for sparse PCA on incomplete and noisy data under a general non-random sampling scheme.
We provide theoretical justification that under certain conditions, we can recover the support of the sparse leading eigenvector with high probability.
We show that our algorithm outperforms several other sparse PCA approaches especially when the observed entries have good structural properties.
arXiv Detail & Related papers (2023-02-03T04:20:25Z)
- End-to-end resource analysis for quantum interior point methods and portfolio optimization [63.4863637315163]
We provide a complete quantum circuit-level description of the algorithm from problem input to problem output.
We report the number of logical qubits and the quantity/depth of non-Clifford T-gates needed to run the algorithm.
arXiv Detail & Related papers (2022-11-22T18:54:48Z)
- Exploring the Algorithm-Dependent Generalization of AUPRC Optimization with List Stability [107.65337427333064]
Optimization of the Area Under the Precision-Recall Curve (AUPRC) is a crucial problem for machine learning.
In this work, we present the first trial in the algorithm-dependent generalization of AUPRC optimization.
Experiments on three image retrieval datasets speak to the effectiveness and soundness of our framework.
arXiv Detail & Related papers (2022-09-27T09:06:37Z)
- Distributed Robust Principal Analysis [0.0]
We study the robust principal component analysis problem in a distributed setting.
We propose the first distributed robust principal analysis algorithm based on consensus factorization, dubbed DCF-PCA.
arXiv Detail & Related papers (2022-07-24T05:45:07Z)
- Quantum Algorithms for Data Representation and Analysis [68.754953879193]
We provide quantum procedures that speed-up the solution of eigenproblems for data representation in machine learning.
The power and practical use of these subroutines are shown through new quantum algorithms, sublinear in the input matrix's size, for principal component analysis, correspondence analysis, and latent semantic analysis.
Results show that the run-time parameters that do not depend on the input's size are reasonable and that the error on the computed model is small, allowing for competitive classification performances.
arXiv Detail & Related papers (2021-04-19T00:41:43Z)
- Stochastic Approximation for Online Tensorial Independent Component Analysis [98.34292831923335]
Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing.
In this paper, we present a by-product online tensorial algorithm that estimates each independent component.
arXiv Detail & Related papers (2020-12-28T18:52:37Z)
- Empirical Bayes PCA in high dimensions [11.806200054814772]
Principal Components Analysis (PCA) is known to exhibit problematic phenomena in the presence of high-dimensional noise.
We propose an Empirical Bayes PCA method that reduces this noise by estimating a structural prior for the joint distributions of the principal components.
arXiv Detail & Related papers (2020-12-21T20:43:44Z)
- Probabilistic Contrastive Principal Component Analysis [0.5286651840245514]
We propose probabilistic contrastive principal component analysis (PCPCA), a model-based alternative to contrastive principal component analysis (CPCA).
We show PCPCA's advantages over CPCA, including greater interpretability, uncertainty quantification and principled inference.
We demonstrate PCPCA's performance through a series of simulations and case-control experiments with datasets of gene expression, protein expression, and images.
arXiv Detail & Related papers (2020-12-14T22:21:50Z)
- Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes [60.479499225746295]
We introduce a new scalable approximation for Gaussian processes with provable guarantees which hold simultaneously over its entire parameter space.
Our approximation is obtained from an improved sample complexity analysis for sparse spectrum Gaussian processes (SSGPs).
arXiv Detail & Related papers (2020-11-17T05:41:50Z)
- Approximation Algorithms for Sparse Principal Component Analysis [57.5357874512594]
Principal component analysis (PCA) is a widely used dimension reduction technique in machine learning and statistics.
Various approaches to obtain sparse principal direction loadings have been proposed, which are termed Sparse Principal Component Analysis (SPCA).
We present thresholding as a provably accurate, polynomial-time approximation algorithm for the SPCA problem.
arXiv Detail & Related papers (2020-06-23T04:25:36Z)
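The thresholding idea in the last entry can be illustrated with a short sketch. The code below is a generic thresholding heuristic for sparse PCA, not the cited paper's exact algorithm or guarantees: it keeps only the k largest-magnitude coordinates of the leading principal component and renormalizes; the function name and the choice of k are illustrative assumptions.

```python
import numpy as np

def thresholded_leading_pc(X, k):
    """Generic thresholding heuristic for sparse PCA (illustrative only).

    X : (num_samples, num_features) real data matrix.
    k : number of nonzero coordinates to retain in the loading vector.
    """
    Xc = X - X.mean(axis=0, keepdims=True)
    cov = Xc.T @ Xc / Xc.shape[0]
    eigvals, eigvecs = np.linalg.eigh(cov)
    v = eigvecs[:, -1]                        # leading principal component
    keep = np.argsort(np.abs(v))[-k:]         # k largest-magnitude entries
    v_sparse = np.zeros_like(v)
    v_sparse[keep] = v[keep]
    return v_sparse / np.linalg.norm(v_sparse)
```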
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.