Randomized based restricted kernel machine for hyperspectral image classification
- URL: http://arxiv.org/abs/2503.05837v1
- Date: Thu, 06 Mar 2025 17:18:39 GMT
- Title: Randomized based restricted kernel machine for hyperspectral image classification
- Authors: A. Quadir, M. Tanveer
- Abstract summary: The random vector functional link (RVFL) network has gained significant popularity in hyperspectral image (HSI) classification. However, RVFL models face several limitations, particularly in handling non-linear relationships and complex data structures. We propose a novel randomized based restricted kernel machine ($R^2KM$) model that combines the strengths of RVFL and restricted kernel machines.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, the random vector functional link (RVFL) network has gained significant popularity in hyperspectral image (HSI) classification due to its simplicity, speed, and strong generalization performance. However, despite these advantages, RVFL models face several limitations, particularly in handling non-linear relationships and complex data structures. The random initialization of input-to-hidden weights can lead to instability, and the model struggles with determining the optimal number of hidden nodes, affecting its performance on more challenging datasets. To address these issues, we propose a novel randomized based restricted kernel machine ($R^2KM$) model that combines the strengths of RVFL and restricted kernel machines (RKM). $R^2KM$ introduces a layered structure that represents kernel methods using both visible and hidden variables, analogous to the energy function in restricted Boltzmann machines (RBM). This structure enables $R^2KM$ to capture complex data interactions and non-linear relationships more effectively, improving both interpretability and model robustness. A key contribution of $R^2KM$ is the introduction of a novel conjugate feature duality based on the Fenchel-Young inequality, which expresses the problem in terms of conjugate dual variables and provides an upper bound on the objective function. This duality enhances the model's flexibility and scalability, offering a more efficient and flexible solution for complex data analysis tasks. Extensive experiments on hyperspectral image datasets and real-world data from the UCI and KEEL repositories show that $R^2KM$ outperforms baseline models, demonstrating its effectiveness in classification and regression tasks.
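For readers unfamiliar with the baseline, the sketch below is a minimal, generic RVFL-style classifier of the kind the abstract describes: input-to-hidden weights are drawn at random and never trained, the original features are kept as direct links, and only the output weights are solved in closed form by ridge regression. Everything here (activation, hyperparameters, function names) is an illustrative assumption, not the paper's implementation; it also makes the stated limitations concrete, since the result depends on the random seed and on the choice of `n_hidden`.

```python
import numpy as np

# Minimal RVFL-style sketch (illustrative assumption, not the paper's code).
# X: (n_samples, n_features), Y: one-hot label matrix (n_samples, n_classes).
def rvfl_fit(X, Y, n_hidden=256, reg=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights, never trained
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # random hidden features
    D = np.hstack([X, H])                         # direct links + hidden features
    # Only the output weights are learned, in closed form (ridge regression).
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    D = np.hstack([X, np.tanh(X @ W + b)])
    return D @ beta                               # class scores; take the argmax per row
```

As a point of reference for the conjugate feature duality mentioned above, the Fenchel-Young inequality states that $f(e) + f^{*}(h) \geq \langle e, h \rangle$ for a convex $f$ and its conjugate $f^{*}(h) := \sup_{e}(\langle e, h \rangle - f(e))$. With the illustrative quadratic choice $f(e) = \frac{1}{2\lambda}\lVert e \rVert^{2}$ (an assumption made here for concreteness, not necessarily the paper's exact loss), one has $f^{*}(h) = \frac{\lambda}{2}\lVert h \rVert^{2}$, so
$$\frac{1}{2\lambda}\lVert e \rVert^{2} \;\geq\; \langle e, h \rangle - \frac{\lambda}{2}\lVert h \rVert^{2} \quad \text{for all } h,$$
which illustrates how conjugate dual variables $h$ enter and how the inequality furnishes the kind of upper bound that RKM-style formulations use to rewrite an objective in terms of visible and hidden variables.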
Related papers
- TRKM: Twin Restricted Kernel Machines for Classification and Regression [0.0]
TRKM combines the benefits of twin models with the robustness of the RKM framework to enhance classification and regression tasks. We implement the TRKM model on the brain age dataset, demonstrating its efficacy in predicting brain age.
arXiv Detail & Related papers (2025-02-13T05:13:46Z) - The Risk of Federated Learning to Skew Fine-Tuning Features and
Underperform Out-of-Distribution Robustness [50.52507648690234]
Federated learning has the risk of skewing fine-tuning features and compromising the robustness of the model.
We introduce three robustness indicators and conduct experiments across diverse robust datasets.
Our approach markedly enhances the robustness across diverse scenarios, encompassing various parameter-efficient fine-tuning methods.
arXiv Detail & Related papers (2024-01-25T09:18:51Z) - Higher Order Gauge Equivariant CNNs on Riemannian Manifolds and
Applications [7.322121417864824]
We introduce a higher order generalization of the gauge equivariant convolution, dubbed a gauge equivariant Volterra network (GEVNet).
This allows us to model spatially extended nonlinear interactions within a given field while still maintaining equivariance to global isometries.
In the neuroimaging data experiments, the resulting two-part architecture is used to automatically discriminate between patients with Lewy Body Disease (DLB), Alzheimer's Disease (AD) and Parkinson's Disease (PD) from diffusion magnetic resonance images (dMRI).
arXiv Detail & Related papers (2023-05-26T06:02:31Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - DA-VEGAN: Differentiably Augmenting VAE-GAN for microstructure
reconstruction from extremely small data sets [110.60233593474796]
DA-VEGAN is a model with two central innovations.
A $\beta$-variational autoencoder is incorporated into a hybrid GAN architecture.
A custom differentiable data augmentation scheme is developed specifically for this architecture.
arXiv Detail & Related papers (2023-02-17T08:49:09Z) - Beyond the Universal Law of Robustness: Sharper Laws for Random Features
and Neural Tangent Kernels [14.186776881154127]
This paper focuses on empirical risk minimization in two settings, namely, random features and the neural tangent kernel (NTK).
We prove that, for random features, the model is not robust for any degree of over-parameterization, even when the necessary condition coming from the universal law of robustness is satisfied.
Our results are corroborated by numerical evidence on both synthetic and standard prototypical datasets.
arXiv Detail & Related papers (2023-02-03T09:58:31Z) - FeDXL: Provable Federated Learning for Deep X-Risk Optimization [105.17383135458897]
We tackle a novel federated learning (FL) problem for optimizing a family of X-risks, to which no existing algorithms are applicable.
The challenges for designing an FL algorithm for X-risks lie in the non-decomposability of the objective over multiple machines and the interdependency between different machines.
arXiv Detail & Related papers (2022-10-26T00:23:36Z) - Tensor-based Multi-view Spectral Clustering via Shared Latent Space [14.470859959783995]
Multi-view Spectral Clustering (MvSC) attracts increasing attention due to diverse data sources.
A new method for MvSC is proposed via a shared latent space from the Restricted Kernel Machine framework.
arXiv Detail & Related papers (2022-07-23T17:30:54Z) - Brain Image Synthesis with Unsupervised Multivariate Canonical
CSC$\ell_4$Net [122.8907826672382]
We propose to learn dedicated features that cross both inter- and intra-modal variations using a novel CSC$\ell_4$Net.
arXiv Detail & Related papers (2021-03-22T05:19:40Z) - Bayesian Sparse Factor Analysis with Kernelized Observations [67.60224656603823]
Multi-view problems can be addressed with latent variable models.
High dimensionality and non-linearity are traditionally handled by kernel methods.
We propose merging both approaches into a single model.
arXiv Detail & Related papers (2020-06-01T14:25:38Z) - Robust Generative Restricted Kernel Machines using Weighted Conjugate
Feature Duality [11.68800227521015]
We introduce weighted conjugate feature duality in the framework of Restricted Kernel Machines (RKMs).
The RKM formulation allows for an easy integration of methods from classical robust statistics.
Experiments show that the weighted RKM is capable of generating clean images when contamination is present in the training data.
arXiv Detail & Related papers (2020-02-04T09:23:25Z)