Regularized Linear Discriminant Analysis Using a Nonlinear Covariance
Matrix Estimator
- URL: http://arxiv.org/abs/2401.17760v2
- Date: Wed, 7 Feb 2024 06:49:41 GMT
- Title: Regularized Linear Discriminant Analysis Using a Nonlinear Covariance
Matrix Estimator
- Authors: Maaz Mahadi, Tarig Ballal, Muhammad Moinuddin, Tareq Y. Al-Naffouri,
and Ubaid M. Al-Saggaf
- Abstract summary: Linear discriminant analysis (LDA) is a widely used technique for data classification.
LDA becomes inefficient when the data covariance matrix is ill-conditioned.
Regularized LDA methods have been proposed to cope with such a situation.
- Score: 11.887333567383239
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Linear discriminant analysis (LDA) is a widely used technique for data
classification. The method offers adequate performance in many classification
problems, but it becomes inefficient when the data covariance matrix is
ill-conditioned. This often occurs when the feature space's dimensionality is
higher than or comparable to the training data size. Regularized LDA (RLDA)
methods based on regularized linear estimators of the data covariance matrix
have been proposed to cope with such a situation. The performance of RLDA
methods is well studied, with optimal regularization schemes already proposed.
In this paper, we investigate the capability of a positive semidefinite
ridge-type estimator of the inverse covariance matrix that coincides with a
nonlinear (NL) covariance matrix estimator. The estimator is derived by
reformulating the score function of the optimal classifier utilizing linear
estimation methods, which eventually results in the proposed NL-RLDA
classifier. We derive asymptotic and consistent estimators of the proposed
technique's misclassification rate under the assumptions of a double-asymptotic
regime and a multivariate Gaussian model for the classes. The consistent
estimator, coupled with a one-dimensional grid search, is used to set the value
of the regularization parameter required for the proposed NL-RLDA classifier.
Performance evaluations based on both synthetic and real data demonstrate the
effectiveness of the proposed classifier. The proposed technique outperforms
state-of-the-art methods across multiple datasets.
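To make the pipeline described in the abstract concrete, here is a minimal sketch of a binary regularized LDA classifier in Python (NumPy/scikit-learn). It is not the paper's NL-RLDA: it substitutes the standard linear ridge estimator (S + gamma*I)^(-1) for the nonlinear covariance estimator, and it uses cross-validated accuracy in place of the paper's consistent estimator of the misclassification rate when grid-searching the regularization parameter. The names RidgeRLDA and tune_gamma are illustrative only.

```python
# Minimal RLDA sketch: ridge-regularized pooled covariance + 1-D grid search
# over the regularization parameter. A stand-in, not the paper's NL-RLDA.
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.model_selection import cross_val_score


class RidgeRLDA(BaseEstimator, ClassifierMixin):
    """Binary LDA with a ridge-regularized pooled covariance (S + gamma*I)."""

    def __init__(self, gamma=1.0):
        self.gamma = gamma

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        X0 = X[y == self.classes_[0]]
        X1 = X[y == self.classes_[1]]
        n0, n1, p = len(X0), len(X1), X.shape[1]
        self.mu0_, self.mu1_ = X0.mean(axis=0), X1.mean(axis=0)
        # Pooled sample covariance; ill-conditioned when p is comparable to n0 + n1.
        S = ((n0 - 1) * np.cov(X0, rowvar=False)
             + (n1 - 1) * np.cov(X1, rowvar=False)) / (n0 + n1 - 2)
        # Ridge-type (linearly regularized) inverse covariance estimate.
        S_inv = np.linalg.inv(S + self.gamma * np.eye(p))
        self.w_ = S_inv @ (self.mu1_ - self.mu0_)
        self.b_ = -0.5 * self.w_ @ (self.mu0_ + self.mu1_) + np.log(n1 / n0)
        return self

    def predict(self, X):
        scores = X @ self.w_ + self.b_
        return np.where(scores > 0, self.classes_[1], self.classes_[0])


def tune_gamma(X, y, grid=np.logspace(-3, 3, 25)):
    """One-dimensional grid search over gamma; cross-validated accuracy stands in
    for the paper's consistent estimator of the misclassification rate."""
    cv_acc = [cross_val_score(RidgeRLDA(gamma=g), X, y, cv=5).mean() for g in grid]
    return grid[int(np.argmax(cv_acc))]
```

In use, `tune_gamma(X, y)` returns the grid point with the highest cross-validated accuracy, which can then be passed to `RidgeRLDA(gamma=...).fit(X, y)`.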
Related papers
- Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
arXiv Detail & Related papers (2024-02-14T16:10:42Z)
- Spectrally-Corrected and Regularized Linear Discriminant Analysis for Spiked Covariance Model [2.517838307493912]
This paper proposes an improved linear discriminant analysis method called spectrally-corrected and regularized LDA (SRLDA).
It is proved that SRLDA has a linear classification global optimal solution under the spiked model assumption.
Experiments on different data sets show that the SRLDA algorithm performs better in classification and dimensionality reduction than currently used tools.
arXiv Detail & Related papers (2022-10-08T00:47:50Z)
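For intuition about the spectral-correction theme in the SRLDA entry above, here is a common spiked-model heuristic: keep the leading k sample eigenvalues (the spikes) and flatten the remaining bulk to its average before using the covariance in an LDA rule. This is a generic stand-in that assumes the number of spikes k is known; it is not the SRLDA estimator analyzed in the paper.

```python
# Generic spiked-model eigenvalue correction; a heuristic sketch, not SRLDA.
import numpy as np


def spike_corrected_covariance(S, k):
    """Keep the top-k (spike) eigenvalues of S and flatten the bulk to its mean."""
    evals, evecs = np.linalg.eigh(S)            # ascending eigenvalues
    evals, evecs = evals[::-1], evecs[:, ::-1]  # sort descending
    corrected = evals.copy()
    corrected[k:] = evals[k:].mean()            # replace the non-spike bulk
    # Reassemble; well conditioned whenever the bulk mean is positive.
    return (evecs * corrected) @ evecs.T
```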
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- Varying Coefficient Linear Discriminant Analysis for Dynamic Data [5.228711636020666]
This paper investigates the varying coefficient LDA model for dynamic data.
By deriving a new discriminant direction function parallel to the Bayes direction, we propose a least-squares estimation procedure.
In the high-dimensional regime, the corresponding data-driven discriminant rule is more computationally efficient than the existing dynamic linear programming rule.
arXiv Detail & Related papers (2022-03-12T07:32:19Z)
- Weight Vector Tuning and Asymptotic Analysis of Binary Linear Classifiers [82.5915112474988]
This paper proposes weight vector tuning of a generic binary linear classifier through the parameterization of a decomposition of the discriminant by a scalar.
It is also found that weight vector tuning significantly improves the performance of Linear Discriminant Analysis (LDA) under high estimation noise.
arXiv Detail & Related papers (2021-10-01T17:50:46Z)
- Self-Weighted Robust LDA for Multiclass Classification with Edge Classes [111.5515086563592]
A novel self-weighted robust LDA with l21-norm based between-class distance criterion, called SWRLDA, is proposed for multi-class classification.
The proposed SWRLDA is easy to implement, and converges fast in practice.
arXiv Detail & Related papers (2020-09-24T12:32:55Z)
- High-Dimensional Quadratic Discriminant Analysis under Spiked Covariance Model [101.74172837046382]
We propose a novel quadratic classification technique, the parameters of which are chosen such that the Fisher discriminant ratio is maximized.
Numerical simulations show that the proposed classifier not only outperforms the classical R-QDA for both synthetic and real data but also requires lower computational complexity.
arXiv Detail & Related papers (2020-06-25T12:00:26Z)
- A Doubly Regularized Linear Discriminant Analysis Classifier with Automatic Parameter Selection [24.027886914804775]
Linear discriminant analysis (LDA) based classifiers tend to falter in many practical settings where the training data size is smaller than, or comparable to, the number of features.
We propose a doubly regularized LDA classifier that we denote as R2LDA.
Results obtained from both synthetic and real data demonstrate the consistency and effectiveness of the proposed R2LDA approach.
arXiv Detail & Related papers (2020-04-28T07:09:22Z)
- Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
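As a rough illustration of the ensemble summarized in the entry above, the sketch below draws Gaussian random projections, fits an ordinary LDA in each projected space, and averages the binary discriminant scores. It is a generic prototype, not the construction of [1]; in particular, the projection dimension d is fixed here, whereas the paper's consistent estimator of the misclassification probability is intended precisely for tuning it.

```python
# Generic randomly-projected LDA ensemble (binary case); a sketch, not the
# exact construction analyzed in the paper.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def rp_lda_ensemble_scores(X_train, y_train, X_test, d=20, n_members=50, seed=0):
    """Average the discriminant scores of LDA classifiers fit on random projections."""
    rng = np.random.default_rng(seed)
    p = X_train.shape[1]
    scores = np.zeros(X_test.shape[0])
    for _ in range(n_members):
        # Gaussian random projection from p down to d dimensions (d < min(p, n)).
        R = rng.standard_normal((p, d)) / np.sqrt(d)
        lda = LinearDiscriminantAnalysis().fit(X_train @ R, y_train)
        # For a binary problem, decision_function returns one signed score per sample.
        scores += lda.decision_function(X_test @ R)
    return scores / n_members
    # Predicted labels follow the sign of the averaged score.
```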
- Covariance Estimation for Matrix-valued Data [9.739753590548796]
We propose a class of distribution-free regularized covariance estimation methods for high-dimensional matrix data.
We formulate a unified framework for estimating bandable covariance, and introduce an efficient algorithm based on rank one unconstrained Kronecker product approximation.
We demonstrate the superior finite-sample performance of our methods using simulations and real applications from a gridded temperature anomalies dataset and an S&P 500 stock data analysis.
arXiv Detail & Related papers (2020-04-11T02:15:26Z)
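The entry above centers on Kronecker-structured covariance for matrix-valued data. As background, the fragment below computes the classical nearest Kronecker product approximation, which becomes a rank-one problem after the Van Loan-Pitsianis rearrangement. It assumes m x n matrix observations with column-stacked vectorization and is a generic building block, not necessarily the paper's bandable-covariance framework or its exact algorithm.

```python
# Nearest Kronecker product approximation of a covariance matrix (Van Loan-
# Pitsianis rearrangement + rank-one SVD); a generic sketch under the stated
# assumptions, not the paper's estimator.
import numpy as np


def nearest_kronecker(Sigma, m, n):
    """Best Frobenius-norm approximation Sigma ~ kron(A, B), A: n x n, B: m x m."""
    # Rearrange the (m*n) x (m*n) matrix so the Kronecker factorization becomes
    # a rank-one approximation problem: each row is one m x m block, flattened.
    R = np.empty((n * n, m * m))
    for i in range(n):
        for j in range(n):
            block = Sigma[i * m:(i + 1) * m, j * m:(j + 1) * m]
            R[i * n + j] = block.reshape(-1)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(s[0]) * U[:, 0].reshape(n, n)   # factors recovered up to a joint sign flip
    B = np.sqrt(s[0]) * Vt[0].reshape(m, m)
    return A, B
```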
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.