Random Manifold Sampling and Joint Sparse Regularization for Multi-label
Feature Selection
- URL: http://arxiv.org/abs/2204.06445v3
- Date: Thu, 30 Mar 2023 13:00:41 GMT
- Title: Random Manifold Sampling and Joint Sparse Regularization for Multi-label
Feature Selection
- Authors: Haibao Li and Hongzhi Zhai
- Abstract summary: The model proposed in this paper can obtain the most relevant few features by solving the joint constrained optimization problems of $\ell_{2,1}$ and $\ell_{F}$ regularization.
Comparative experiments on real-world data sets show that the proposed method outperforms other methods.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-label learning is usually used to mine the correlation between features
and labels, and feature selection can retain as much information as possible
through a small number of features. The $\ell_{2,1}$ regularization method yields a
sparse coefficient matrix, but it cannot handle the multicollinearity problem
effectively. The model proposed in this paper obtains the most relevant few
features by solving a joint constrained optimization problem with $\ell_{2,1}$
and $\ell_{F}$ regularization. In the manifold regularization term, we implement a random
walk strategy based on the joint information matrix and obtain a highly robust
neighborhood graph. In addition, we give an algorithm for solving the model
and prove its convergence. Comparative experiments on real-world data sets show
that the proposed method outperforms other methods.
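As a rough sketch of the joint-penalty idea (not the authors' exact algorithm: the objective form, parameter values, and iteration count below are assumptions, and the manifold term built from the random-walk neighborhood graph is omitted), the following minimizes $\|XW-Y\|_F^2 + \alpha\|W\|_{2,1} + \beta\|W\|_F^2$ via the standard iterative reweighting of the $\ell_{2,1}$ term:

```python
import numpy as np

def joint_sparse_fs(X, Y, alpha=1.0, beta=0.1, n_iter=50, eps=1e-8):
    """Toy solver for min_W ||XW - Y||_F^2 + alpha*||W||_{2,1} + beta*||W||_F^2.

    Iteratively reweighted least squares: the l2,1 term is majorized via
    D = diag(alpha / (2*||w_i||_2)), giving a closed-form update each round.
    """
    n, d = X.shape
    W = np.linalg.lstsq(X, Y, rcond=None)[0]          # warm start
    XtX, XtY = X.T @ X, X.T @ Y
    for _ in range(n_iter):
        row_norms = np.linalg.norm(W, axis=1) + eps   # avoid divide-by-zero
        D = np.diag(alpha / (2.0 * row_norms))
        W = np.linalg.solve(XtX + D + beta * np.eye(d), XtY)
    return W

# Usage: rank features by the l2 norms of the rows of W.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = (X[:, :3] @ rng.standard_normal((3, 4)) > 0).astype(float)  # 4 labels
W = joint_sparse_fs(X, Y)
print(np.argsort(-np.linalg.norm(W, axis=1))[:5])
```

Rows of $W$ with large $\ell_2$ norm mark the selected features; the Frobenius term plays the ridge-like role that counteracts multicollinearity.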
Related papers
- Embedded Multi-label Feature Selection via Orthogonal Regression [45.55795914923279]
State-of-the-art embedded multi-label feature selection algorithms based on least squares regression cannot preserve sufficient discriminative information in multi-label data.
A novel embedded method based on orthogonal regression, GRROOR, is proposed to facilitate multi-label feature selection.
Extensive experimental results on ten multi-label data sets demonstrate the effectiveness of GRROOR.
arXiv Detail & Related papers (2024-03-01T06:18:40Z)
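GRROOR's actual objective is richer (global redundancy and relevance terms in an orthogonal regression model); purely as a hedged illustration of the orthogonal-regression ingredient, features can be scored through the orthogonal Procrustes solution. Every name and the scoring rule here are illustrative assumptions:

```python
import numpy as np

def orthogonal_regression_scores(X, Y):
    """Rank features by the rows of an orthogonal weight matrix.

    Solves max_W tr(W^T X^T Y) s.t. W^T W = I (orthogonal Procrustes),
    which coincides with orthogonal least squares when X is whitened.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y, full_matrices=False)
    W = U @ Vt                                   # d x m, orthonormal columns
    return np.linalg.norm(W, axis=1)             # one score per feature

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30))
Y = X[:, :4] @ rng.standard_normal((4, 6))       # 6 labels from 4 features
scores = orthogonal_regression_scores(X - X.mean(0), Y)
print(np.argsort(-scores)[:4])
```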
- A model-free feature selection technique of feature screening and random forest based recursive feature elimination [0.0]
We propose a model-free feature selection method for ultra-high-dimensional data with a massive number of features.
We show that the proposed method is selection consistent and $L$ consistent under weak regularity conditions.
arXiv Detail & Related papers (2023-02-15T03:39:16Z)
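The two-stage recipe (screen first, then random-forest-based recursive feature elimination) can be sketched with scikit-learn; the screening statistic, cut-offs, and hyperparameters below are assumptions rather than the paper's exact choices:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

# Stage 1: model-free screening -- keep features with the largest
# absolute marginal correlation to the response (a sure-screening proxy).
X, y = make_classification(n_samples=300, n_features=500,
                           n_informative=10, random_state=0)
corr = np.abs(np.corrcoef(X, y[:, None], rowvar=False)[-1, :-1])
keep = np.argsort(-corr)[:50]                    # screen 500 -> 50

# Stage 2: random-forest-based recursive feature elimination.
rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
          n_features_to_select=10, step=5)
rfe.fit(X[:, keep], y)
print(keep[rfe.support_])                        # final selected features
```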
- A distribution-free mixed-integer optimization approach to hierarchical modelling of clustered and longitudinal data [0.0]
We introduce an innovative algorithm that evaluates cluster effects for new data points, thereby increasing the robustness and precision of this model.
The inferential and predictive efficacy of this approach is further illustrated through its application in student scoring and protein expression.
arXiv Detail & Related papers (2023-02-06T23:34:51Z)
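The paper's mixed-integer formulation is not reproduced here; as a loose, hypothetical illustration of "evaluating cluster effects for new data points", a simple two-stage fit with nearest-centroid assignment (all modelling choices below are assumptions):

```python
import numpy as np

def fit_cluster_effects(X, y, groups):
    """Toy two-stage hierarchical fit: pooled OLS for fixed effects,
    then per-cluster mean residuals as additive cluster effects."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    effects = {g: resid[groups == g].mean() for g in np.unique(groups)}
    centroids = {g: X[groups == g].mean(axis=0) for g in np.unique(groups)}
    return beta, effects, centroids

def predict_new(x_new, beta, effects, centroids):
    """Give a new point the effect of the nearest cluster centroid."""
    g = min(centroids, key=lambda g: np.linalg.norm(x_new - centroids[g]))
    return x_new @ beta + effects[g]

rng = np.random.default_rng(2)
groups = np.repeat(np.arange(5), 40)
X = rng.standard_normal((200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + np.array([0, 1, -1, 2, -2])[groups]
beta, effects, centroids = fit_cluster_effects(X, y, groups)
print(predict_new(X[0], beta, effects, centroids), y[0])
```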
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
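This is not the paper's Riemannian method, and it uses a Gaussian rather than a general elliptical likelihood: a crude Euclidean projected-gradient sketch of penalized maximum likelihood under a "ridge plus rank-k" precision constraint, with all constants assumed:

```python
import numpy as np

def lowrank_precision(S, rank, n_iter=200, lr=0.01, ridge=0.1):
    """Sketch: ascend logdet(T) - tr(S T), projecting T onto
    'ridge*I + rank-k PSD' after each step (a crude Euclidean stand-in
    for the paper's fixed-rank Riemannian geometry)."""
    d = S.shape[0]
    T = np.eye(d)
    for _ in range(n_iter):
        grad = np.linalg.inv(T) - S               # log-likelihood gradient
        T = 0.5 * ((T + lr * grad) + (T + lr * grad).T)
        vals, vecs = np.linalg.eigh(T - ridge * np.eye(d))
        vals[:-rank] = 0.0                        # keep top-k eigenvalues
        vals = np.clip(vals, 0.0, None)
        T = ridge * np.eye(d) + (vecs * vals) @ vecs.T
    return T

rng = np.random.default_rng(3)
Z = rng.standard_normal((500, 8))
S = np.cov(Z, rowvar=False)
print(np.round(lowrank_precision(S, rank=3), 2))
```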
- Multi-block-Single-probe Variance Reduced Estimator for Coupled Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of coupled compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on the strong convexity parameter.
arXiv Detail & Related papers (2022-07-18T12:03:26Z)
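A single-block caricature of an MSVR-style tracked estimator on a toy compositional objective; the constants, the test problem, and the reduction to one block are all assumptions made for brevity:

```python
import numpy as np

# Toy coupled compositional problem: minimize f(g(w)) with noisy inner
# function g(w) = Aw + noise and f(u) = 0.5*||u - b||^2. The inner value
# is tracked by a moving average plus a correction term evaluated at
# consecutive iterates, in the spirit of MSVR-style variance reduction.
rng = np.random.default_rng(4)
A = rng.standard_normal((6, 4))
b = rng.standard_normal(6)

def g_sample(w):
    return A @ w + 0.1 * rng.standard_normal(6)   # noisy probe of g(w)

w = np.zeros(4); w_prev = w.copy()
u = g_sample(w)
beta, gamma, lr = 0.2, 0.8, 0.05
for t in range(500):
    gw, gw_prev = g_sample(w), g_sample(w_prev)
    u = (1 - beta) * u + beta * gw + gamma * (gw - gw_prev)  # tracked g(w)
    grad = A.T @ (u - b)              # chain rule: (dg/dw)^T f'(u)
    w_prev = w.copy()
    w = w - lr * grad
print(np.linalg.norm(A @ w - b))      # residual of the least-squares fit
```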
- Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace of a matrix.
arXiv Detail & Related papers (2021-09-30T19:17:35Z)
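As a hedged sketch in that spirit (the threshold level and schedule are assumptions, not the paper's), power steps can be interleaved with soft-thresholding and re-orthonormalisation:

```python
import numpy as np

def sparse_orthogonal_iteration(A, k, lam=0.05, n_iter=100):
    """Orthogonal iteration for the dominant k-dim eigenspace of a
    symmetric matrix A, with a soft-threshold step that promotes
    sparsity in the eigenspace basis."""
    rng = np.random.default_rng(0)
    X = np.linalg.qr(rng.standard_normal((A.shape[0], k)))[0]
    for _ in range(n_iter):
        Y = A @ X                                           # power step
        Y = np.sign(Y) * np.maximum(np.abs(Y) - lam, 0.0)   # soft threshold
        X, _ = np.linalg.qr(Y)                              # re-orthonormalise
    return X

# A block matrix whose dominant eigenvector is sparse; check recovery.
A = np.zeros((8, 8)); A[:4, :4] = 5.0; A[4:, 4:] = 1.0
print(np.round(sparse_orthogonal_iteration(A, k=1), 2))
```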
- Solving weakly supervised regression problem using low-rank manifold regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
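The paper's low-rank treatment is omitted here; a minimal manifold-regularization sketch in which known, uncertain, and unknown labels receive fit weights 1, 0.3, and 0 (the weights, kernel bandwidth, and regularization strength are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
X = np.sort(rng.uniform(0, 1, 60))[:, None]
f_true = np.sin(2 * np.pi * X[:, 0])
y = f_true + 0.1 * rng.standard_normal(60)

c = np.zeros(60); c[::6] = 1.0; c[3::6] = 0.3   # 1 = known, 0.3 = uncertain
W = np.exp(-((X - X.T) ** 2) / 0.01)             # RBF adjacency graph
L = np.diag(W.sum(1)) - W                        # graph Laplacian
C = np.diag(c)
lam = 0.5

# Minimise sum_i c_i (f_i - y_i)^2 + lam * f^T L f in closed form.
f = np.linalg.solve(C + lam * L, C @ y)
print(np.round(np.abs(f - f_true).mean(), 3))    # mean recovery error
```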
- Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method, by combining reconstruction error with $l_{2,p}$-norm regularization.
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
arXiv Detail & Related papers (2020-12-29T04:08:38Z)
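As a hedged sketch of reconstruction error plus $l_{2,p}$ row sparsity (a self-representation form chosen for brevity; the paper's exact model and updates may differ), iterative reweighting again yields a closed-form step:

```python
import numpy as np

def l2p_feature_selection(X, lam=1.0, p=1.0, n_iter=30, eps=1e-8):
    """Sketch: min_W ||X - XW||_F^2 + lam * ||W||_{2,p}^p by iterative
    reweighting; each step solves a weighted ridge system. Feature i is
    scored by the l2 norm of row i of W."""
    d = X.shape[1]
    W = np.eye(d)
    XtX = X.T @ X
    for _ in range(n_iter):
        r = np.linalg.norm(W, axis=1) + eps
        D = np.diag(lam * (p / 2.0) * r ** (p - 2.0))  # from d||w_i||^p
        W = np.linalg.solve(XtX + D, XtX)
    return np.linalg.norm(W, axis=1)

# Features 0-2 generate the rest; they should receive the top scores.
rng = np.random.default_rng(6)
B = rng.standard_normal((300, 3))
X = np.hstack([B, B @ rng.standard_normal((3, 7)) * 0.1])
print(np.argsort(-l2p_feature_selection(X))[:3])
```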
- On the Adversarial Robustness of LASSO Based Feature Selection [72.54211869067979]
In the considered model, there is a malicious adversary who can observe the whole dataset, and then will carefully modify the response values or the feature matrix.
We formulate the modification strategy of the adversary as a bi-level optimization problem.
Numerical examples with synthetic and real data illustrate that our method is efficient and effective.
arXiv Detail & Related papers (2020-10-20T05:51:26Z)
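The sketch below is not the paper's bi-level solution: it is a one-step heuristic that exploits the locally affine dependence of the lasso solution on the responses, with the target choice, perturbation budget, and attack direction all assumed for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n, d = 100, 20
X = rng.standard_normal((n, d))
beta_true = np.zeros(d); beta_true[:3] = 2.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)
active = np.flatnonzero(lasso.coef_)
target = active[0]                      # coefficient the adversary suppresses

# On a fixed active set A the lasso solution is affine in y, so
# d beta_target / dy = X_A (X_A^T X_A)^{-1} e_j. Push y against that
# direction within an l_inf budget eps to shrink the target coefficient.
X_A = X[:, active]
j = list(active).index(target)
sens = X_A @ np.linalg.inv(X_A.T @ X_A)[:, j]
eps = 0.5
y_adv = y - eps * np.sign(sens) * np.sign(lasso.coef_[target])

print("before:", lasso.coef_[target],
      "after:", Lasso(alpha=0.1).fit(X, y_adv).coef_[target])
```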
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.