Joint and Progressive Subspace Analysis (JPSA) with Spatial-Spectral
Manifold Alignment for Semi-Supervised Hyperspectral Dimensionality Reduction
- URL: http://arxiv.org/abs/2009.10003v1
- Date: Mon, 21 Sep 2020 16:29:59 GMT
- Authors: Danfeng Hong, Naoto Yokoya, Jocelyn Chanussot, Jian Xu, Xiao Xiang Zhu
- Abstract summary: We propose a novel technique for hyperspectral subspace analysis.
The technique is called joint and progressive subspace analysis (JPSA).
Experiments are conducted to demonstrate the superiority and effectiveness of the proposed JPSA on two widely-used hyperspectral datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conventional nonlinear subspace learning techniques (e.g., manifold learning)
usually suffer from drawbacks in explainability (no explicit mapping),
cost-effectiveness (no linearization), generalization capability (out-of-sample
extension), and representability (spatial-spectral discrimination). To overcome these
shortcomings, a novel linearized subspace analysis technique with
spatial-spectral manifold alignment is developed for a semi-supervised
hyperspectral dimensionality reduction (HDR), called joint and progressive
subspace analysis (JPSA). The JPSA learns a high-level, semantically
meaningful, joint spatial-spectral feature representation from hyperspectral
data by 1) jointly learning latent subspaces and a linear classifier to find an
effective projection direction favorable for classification; 2) progressively
searching several intermediate states of subspaces to approach an optimal
mapping from the original space to a potentially more discriminative subspace; 3)
spatially and spectrally aligning the manifold structure in each learned latent
subspace in order to preserve the same or similar topological properties between
the compressed data and the original data. A simple but effective
nearest-neighbor (NN) classifier is used to validate the performance of the
different HDR approaches. Extensive
experiments are conducted to demonstrate the superiority and effectiveness of
the proposed JPSA on two widely-used hyperspectral datasets: Indian Pines
(92.98%) and the University of Houston (86.09%) in comparison with previous
state-of-the-art HDR methods. A demo of the baseline version of this work
(ECCV 2018) is openly available at https://github.com/danfenghong/ECCV2018_J-Play.
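To make the three ingredients above concrete, here is a minimal NumPy/SciPy sketch of progressive subspace learning with a jointly learned linear classifier and a graph-Laplacian alignment term. The objective, update rules, and all names (jpsa_sketch, dims, L, alpha, gamma) are simplifying assumptions for a toy-scale problem, not the authors' implementation (their demo lives in the linked repository).

```python
import numpy as np
from scipy.linalg import solve_sylvester

def jpsa_sketch(X0, Y, dims, L, alpha=0.1, gamma=1.0, eps=1e-3, n_iter=20):
    """Toy progressive subspace learning in the spirit of JPSA (assumed form):

        sum_k ||X_k - U_k X_{k-1}||^2     progressive subspaces
      + gamma * ||Y - W X_K||^2           jointly learned linear classifier
      + alpha * sum_k tr(X_k L X_k^T)     spatial-spectral manifold alignment

    X0: (d0, n) pixels as columns; Y: (c, n) one-hot labels; L: (n, n) Laplacian.
    """
    rng = np.random.default_rng(0)
    Xs = [X0]
    for d in dims:  # initialize latent states with small random projections
        U = rng.standard_normal((d, Xs[-1].shape[0])) / np.sqrt(Xs[-1].shape[0])
        Xs.append(U @ Xs[-1])
    Us = [None] * len(dims)
    W = None
    for _ in range(n_iter):
        # Classifier step: ridge regression of labels on the deepest subspace.
        XK = Xs[-1]
        W = Y @ XK.T @ np.linalg.inv(XK @ XK.T + eps * np.eye(XK.shape[0]))
        # Projection steps: U_k maps the previous latent state to the next one.
        for k in range(len(dims)):
            Xp = Xs[k]
            Us[k] = Xs[k + 1] @ Xp.T @ np.linalg.inv(Xp @ Xp.T + eps * np.eye(Xp.shape[0]))
        # Latent-state steps: each X_k balances reconstruction from below,
        # pull from above (or from the classifier), and Laplacian smoothness.
        # The stationarity condition A X_k + X_k (alpha L) = C is a Sylvester
        # equation, practical only at toy scale because L is n x n.
        for k in range(1, len(Xs)):
            A = np.eye(Xs[k].shape[0])
            C = Us[k - 1] @ Xs[k - 1]
            if k < len(Xs) - 1:                # pulled toward the deeper state
                A += Us[k].T @ Us[k]
                C += Us[k].T @ Xs[k + 1]
            else:                              # deepest state also fits labels
                A += gamma * (W.T @ W)
                C += gamma * (W.T @ Y)
            Xs[k] = solve_sylvester(A, alpha * L, C)
    return Us, W
```

The alternating structure (classifier, projections, latent states) mirrors the "joint and progressive" wording of the abstract; the closed-form ridge and Sylvester updates are one convenient way to realize it, not a claim about the paper's optimizer.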
Related papers
- Subspace Representation Learning for Sparse Linear Arrays to Localize More Sources than Sensors: A Deep Learning Methodology [19.100476521802243]
We develop a novel methodology that estimates the co-array subspaces from a sample covariance matrix for sparse linear arrays (SLAs).
To learn such representations, we propose loss functions that gauge the separation between the desired and the estimated subspaces.
The computation of learning subspaces of different dimensions is accelerated by a new batch sampling strategy.
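The summary does not specify the loss functions; a standard way to gauge the separation between two subspaces, shown below as a hedged stand-in, is the Frobenius distance between their orthogonal projectors.

```python
import numpy as np

def subspace_distance(A, B):
    """Frobenius distance between the orthogonal projectors onto the column
    spaces of A and B; one standard candidate for a subspace-separation loss
    (the paper's exact losses may differ)."""
    Qa, _ = np.linalg.qr(A)   # orthonormal basis of span(A)
    Qb, _ = np.linalg.qr(B)   # orthonormal basis of span(B)
    return np.linalg.norm(Qa @ Qa.T - Qb @ Qb.T, "fro")
```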
arXiv Detail & Related papers (2024-08-29T15:14:52Z)
- Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with that of a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
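As a rough, hedged illustration of combining the two eigenanalyses (not the paper's actual synergy mechanism), one can project samples onto the leading eigenvectors of both the data covariance and a precomputed loss Hessian and concatenate the two views:

```python
import numpy as np

def joint_eigen_projection(X, H, k=2):
    """Project samples onto the top-k eigenvectors of the data covariance
    and of a precomputed Hessian H, then concatenate the two views.
    Purely illustrative; the paper's combination may differ.

    X : (n, d) training samples; H : (d, d) Hessian of the training loss.
    """
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / (len(X) - 1)              # sample covariance
    _, vec_c = np.linalg.eigh(C)               # eigenvalues ascending
    _, vec_h = np.linalg.eigh((H + H.T) / 2)   # symmetrize for safety
    Vc = vec_c[:, -k:]                         # top-k covariance directions
    Vh = vec_h[:, -k:]                         # top-k Hessian directions
    return np.hstack([Xc @ Vc, Xc @ Vh])       # (n, 2k) joint features
```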
arXiv Detail & Related papers (2024-02-14T16:10:42Z)
- Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a denoised sample obtained via Tweedie's formula forms a Krylov subspace, then conjugate gradient (CG) initialized with the denoised data ensures that the data-consistency update remains in that tangent space.
Our proposed method achieves inference more than 80 times faster than the previous state-of-the-art method.
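The CG-based data-consistency step can be sketched as a few conjugate-gradient iterations on the normal equations, warm-started at the denoised sample; this toy version (cg_data_consistency is a hypothetical name) illustrates the mechanism, not the paper's full sampler.

```python
import numpy as np

def cg_data_consistency(x0, A, y, n_steps=5):
    """Run a few CG iterations on the normal equations A^T A x = A^T y,
    warm-started at the denoised sample x0 (the Tweedie estimate). Staying
    in the Krylov subspace generated from x0 is what keeps the update near
    the denoised point in the paper's argument. Illustrative only."""
    x = x0.copy()
    r = A.T @ (y - A @ x)        # residual of the normal equations
    p = r.copy()
    rs = r @ r
    for _ in range(n_steps):
        Ap = A.T @ (A @ p)
        step = rs / (p @ Ap)
        x = x + step * p
        r = r - step * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```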
arXiv Detail & Related papers (2023-03-10T07:42:49Z)
- Linking data separation, visual separation, and classifier performance using pseudo-labeling by contrastive learning [125.99533416395765]
We argue that the performance of the final classifier depends on the data separation present in the latent space and visual separation present in the projection.
We demonstrate our results by the classification of five real-world challenging image datasets of human intestinal parasites with only 1% supervised samples.
arXiv Detail & Related papers (2023-02-06T10:01:38Z)
- Unsupervised Manifold Linearizing and Clustering [19.879641608165887]
We propose to optimize the Maximal Coding Rate Reduction metric with respect to both the data representation and a novel doubly stochastic cluster membership.
Experiments on CIFAR-10, -20, -100, and TinyImageNet-200 datasets show that the proposed method is much more accurate and scalable than state-of-the-art deep clustering methods.
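For context, the coding-rate quantity behind the Maximal Coding Rate Reduction (MCR^2) principle has a compact closed form; the sketch below computes it under the standard definition, while the paper's full objective and the doubly stochastic memberships add further machinery.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z @ Z.T) for Z of shape (d, n):
    the number of bits needed to encode the columns of Z up to distortion eps.
    MCR^2-style objectives maximize this global rate minus the rates of the
    per-cluster submatrices. Standard definition; usage here is illustrative."""
    d, n = Z.shape
    _, logdet = np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * Z @ Z.T)
    return 0.5 * logdet
```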
arXiv Detail & Related papers (2023-01-04T20:08:23Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
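The abstract does not define the origin-aware penalty; the following is one plausible, explicitly assumed reading in the Poincaré ball (curvature -1): penalize embeddings that collapse toward the origin by rewarding hyperbolic distance from it.

```python
import numpy as np

def origin_aware_penalty(emb, eps=1e-5):
    """One plausible reading of an 'origin-aware penalty' (an assumption,
    not HRCF's exact formula): discourage embeddings from collapsing to the
    origin of the Poincare ball by penalizing small hyperbolic norms.
    emb : (n, d) points with Euclidean norm < 1 (Poincare ball, c = 1)."""
    norms = np.clip(np.linalg.norm(emb, axis=1), eps, 1 - eps)
    dist_to_origin = 2.0 * np.arctanh(norms)   # hyperbolic distance to origin
    return -np.mean(dist_to_origin)            # lower loss = farther from origin
```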
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Provably Accurate and Scalable Linear Classifiers in Hyperbolic Spaces [39.71927912296049]
We propose a unified framework for learning scalable and simple hyperbolic linear classifiers.
The gist of our approach is to focus on Poincaré ball models and formulate the classification problems using tangent space formalisms.
The excellent performance of the Poincaré second-order and strategic perceptrons shows that the proposed framework can be extended to general machine learning problems in hyperbolic spaces.
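The tangent-space formalism can be made concrete with the logarithmic map at the origin of the Poincaré ball, after which an ordinary linear classifier applies; the sketch below uses the standard curvature -1 log map and a plain perceptron as simplified stand-ins for the paper's Poincaré perceptrons.

```python
import numpy as np

def log_map_origin(X, eps=1e-7):
    """Logarithmic map at the origin of the Poincare ball (curvature -1):
    log_0(x) = artanh(||x||) * x / ||x||. Maps ball points into the tangent
    space at 0, where standard linear classifiers operate."""
    norms = np.clip(np.linalg.norm(X, axis=1, keepdims=True), eps, 1 - eps)
    return np.arctanh(norms) * X / norms

def tangent_perceptron(X_ball, y, n_epochs=10, lr=0.1):
    """Plain perceptron on log-mapped features; an illustrative
    simplification of hyperbolic linear classification."""
    Z = log_map_origin(X_ball)                 # flatten to tangent space
    w = np.zeros(Z.shape[1])
    b = 0.0
    for _ in range(n_epochs):
        for z, t in zip(Z, y):                 # labels t in {-1, +1}
            if t * (z @ w + b) <= 0:           # misclassified point
                w += lr * t * z
                b += lr * t
    return w, b
```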
arXiv Detail & Related papers (2022-03-07T21:36:21Z)
- Joint Characterization of Spatiotemporal Data Manifolds [0.0]
Dimensionality reduction (DR) is a type of characterization designed to mitigate the "curse of dimensionality" in high-dimensional signals.
Recent years have additionally seen the development of a suite of nonlinear DR algorithms, frequently categorized as "manifold learning".
Here, we show that three such DR approaches can yield complementary information about spatiotemporal (ST) manifold topology.
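Which three DR approaches the paper compares is not recoverable from this snippet; as a generic, assumed illustration, embedding the same data with one linear and two nonlinear methods shows how the views can complement one another.

```python
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap, SpectralEmbedding

def complementary_dr_views(X, n_components=2):
    """Embed the same (n_samples, n_features) matrix with one linear and two
    nonlinear DR methods. The specific methods here are stand-ins, not the
    paper's choices."""
    return {
        "pca": PCA(n_components=n_components).fit_transform(X),
        "isomap": Isomap(n_components=n_components).fit_transform(X),
        "laplacian": SpectralEmbedding(n_components=n_components).fit_transform(X),
    }
```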
arXiv Detail & Related papers (2021-08-21T16:42:22Z)
- Low-Rank Subspaces in GANs [101.48350547067628]
This work introduces low-rank subspaces that enable more precise control of GAN generation.
LowRankGAN is able to find a low-dimensional representation of the attribute manifold.
Experiments on state-of-the-art GAN models (including StyleGAN2 and BigGAN) trained on various datasets demonstrate the effectiveness of our LowRankGAN.
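A hedged sketch of the general recipe (an SVD of the generator Jacobian restricted to an image region, then perturbing the latent code along leading right-singular vectors); LowRankGAN's actual low-rank factorization is more involved, and region_edit_directions is a hypothetical name.

```python
import torch

def region_edit_directions(G, z, region_mask, k=5):
    """Find k latent directions that mainly affect a masked image region by
    taking the SVD of the region-restricted generator Jacobian. A generic
    sketch of Jacobian-subspace editing, not LowRankGAN's exact algorithm.

    G           : differentiable latent -> image function
    z           : (latent_dim,) latent code
    region_mask : boolean tensor matching G(z)'s shape
    """
    def region_output(latent):
        return G(latent)[region_mask]  # flatten the masked pixels
    J = torch.autograd.functional.jacobian(region_output, z)  # (pixels, latent_dim)
    _, _, Vh = torch.linalg.svd(J, full_matrices=False)
    return Vh[:k]                      # top-k latent directions (rows)

# Usage sketch: move z along a discovered direction to edit the region, e.g.
# z_edit = z + 3.0 * directions[0]
```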
arXiv Detail & Related papers (2021-06-08T16:16:32Z)
- Fusion of Dual Spatial Information for Hyperspectral Image Classification [26.304992631350114]
A novel hyperspectral image classification framework using the fusion of dual spatial information is proposed.
Experiments performed on three data sets from different scenes illustrate that the proposed method can outperform other state-of-the-art classification techniques.
arXiv Detail & Related papers (2020-10-23T12:20:18Z)