Hyperspectral Unmixing via Nonnegative Matrix Factorization with
Handcrafted and Learnt Priors
- URL: http://arxiv.org/abs/2010.04611v1
- Date: Fri, 9 Oct 2020 14:40:20 GMT
- Title: Hyperspectral Unmixing via Nonnegative Matrix Factorization with
Handcrafted and Learnt Priors
- Authors: Min Zhao, Tiande Gao, Jie Chen, Wei Chen
- Abstract summary: We propose an NMF-based unmixing framework which jointly uses a handcrafted regularizer and a regularizer learnt from data.
We plug in learnt priors on the abundances, whose associated subproblem can be addressed using various image denoisers.
- Score: 14.032039261229853
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nowadays, nonnegative matrix factorization (NMF) based methods have been
widely applied to blind spectral unmixing. Introducing proper regularizers to
NMF is crucial for mathematically constraining the solutions and physically
exploiting the spectral and spatial properties of images. Generally, properly
handcrafting regularizers and solving the associated complex optimization
problem are non-trivial tasks. In our work, we propose an NMF-based unmixing
framework which jointly uses a handcrafted regularizer and a regularizer learnt
from data. We plug in learnt priors on the abundances, whose associated
subproblem can be addressed using various image denoisers, and we apply an
$\ell_{2,1}$-norm regularizer to the abundance matrix to promote sparse unmixing
results. The proposed framework is flexible and extendable. Experiments on both
synthetic data and real airborne data confirm the effectiveness of our method.
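Below is a minimal sketch of the kind of alternating scheme the abstract describes, assuming a linear mixing model Y ≈ EA with endmember matrix E and abundance matrix A. The Gaussian filter standing in for a learnt denoiser, the blending weight, the step sizes, and all function names are illustrative assumptions, not the authors' implementation.
```python
# Hedged sketch: NMF unmixing with a handcrafted l_{2,1} prior and a
# plug-and-play "learnt" prior applied to the abundance maps.
import numpy as np
from scipy.ndimage import gaussian_filter  # stand-in for a learnt denoiser


def prox_l21(A, tau):
    """Row-wise soft-thresholding: proximal operator of tau * ||A||_{2,1}."""
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    return A * np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)


def denoise_abundances(A, height, width, sigma=1.0):
    """Plug-and-play step: denoise each abundance map; any image denoiser
    could be substituted here (the Gaussian filter is only a placeholder)."""
    maps = A.reshape(A.shape[0], height, width)
    maps = np.stack([gaussian_filter(m, sigma) for m in maps])
    return np.clip(maps.reshape(A.shape[0], -1), 0.0, None)


def unmix(Y, R, height, width, n_iter=200, lam=0.1, mu=0.5):
    """Y: (bands, pixels) hyperspectral data, R: number of endmembers."""
    bands, pixels = Y.shape
    rng = np.random.default_rng(0)
    E = np.abs(rng.standard_normal((bands, R)))    # endmember signatures
    A = np.abs(rng.standard_normal((R, pixels)))   # abundances
    for _ in range(n_iter):
        # endmember update: projected gradient step on ||Y - EA||_F^2
        step_E = 1.0 / (np.linalg.norm(A @ A.T, 2) + 1e-12)
        E = np.clip(E - step_E * (E @ A - Y) @ A.T, 0.0, None)
        # abundance update: gradient step + handcrafted l_{2,1} prox
        step_A = 1.0 / (np.linalg.norm(E.T @ E, 2) + 1e-12)
        A = np.clip(A - step_A * E.T @ (E @ A - Y), 0.0, None)
        A = prox_l21(A, lam * step_A)
        # learnt prior: blend in the denoised abundance maps
        A = (1.0 - mu) * A + mu * denoise_abundances(A, height, width)
    return E, A
```
In the paper's framework the denoising step corresponds to the subproblem handled by the learnt prior, so swapping the Gaussian filter for a trained image denoiser would be the natural extension of this sketch.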
Related papers
- Large-scale gradient-based training of Mixtures of Factor Analyzers [67.21722742907981]
This article contributes both a theoretical analysis and a new method for efficient high-dimensional training by gradient descent.
We prove that MFA training and inference/sampling can be performed based on precision matrices, which does not require matrix inversions after training is completed.
Besides the theoretical analysis, we apply MFA to typical image datasets such as SVHN and MNIST, and demonstrate its ability to perform sample generation and outlier detection.
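The inversion-free claim can be illustrated with the Woodbury identity for a single factor-analysis covariance C = LLᵀ + diag(d), where only a small K x K system has to be solved. The numpy check below uses hypothetical dimensions and variable names and is not necessarily the parameterization used in the paper.
```python
# Hedged illustration: the precision of a factor-analysis covariance via the
# Woodbury identity, avoiding any full D x D matrix inversion.
import numpy as np

rng = np.random.default_rng(1)
D, K = 50, 5                                   # data dim, latent dim (illustrative)
L = rng.standard_normal((D, K))                # factor loadings
d = rng.uniform(0.5, 2.0, size=D)              # diagonal noise variances

C = L @ L.T + np.diag(d)                       # component covariance
Dinv = np.diag(1.0 / d)
M = np.eye(K) + L.T @ Dinv @ L                 # small K x K matrix
P = Dinv - Dinv @ L @ np.linalg.solve(M, L.T @ Dinv)  # Woodbury precision

assert np.allclose(P @ C, np.eye(D), atol=1e-8)
```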
arXiv Detail & Related papers (2023-08-26T06:12:33Z) - Nonlinear Hyperspectral Unmixing based on Multilinear Mixing Model using
Convolutional Autoencoders [6.867229549627128]
We propose a novel autoencoder-based network for unsupervised nonlinear unmixing based on the multilinear mixing model.
Experiments on both the synthetic and real datasets demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2023-03-14T18:11:52Z) - Quadratic Matrix Factorization with Applications to Manifold Learning [1.6795461001108094]
We propose a quadratic matrix factorization (QMF) framework to learn the curved manifold on which the dataset lies.
Algorithmically, we propose an alternating minimization algorithm to optimize QMF and establish its theoretical convergence properties.
Experiments on a synthetic manifold learning dataset and two real datasets, including the MNIST handwritten dataset and a cryogenic electron microscopy dataset, demonstrate the superiority of the proposed method over its competitors.
arXiv Detail & Related papers (2023-01-30T15:09:00Z) - A Novel Maximum-Entropy-Driven Technique for Low-Rank Orthogonal
Nonnegative Matrix Factorization with $\ell_0$-Norm sparsity Constraint [0.0]
In data-driven control and machine learning, a common requirement involves breaking down large matrices into smaller, low-rank factors.
This paper introduces an innovative solution to the orthogonal nonnegative matrix factorization (ONMF) problem.
The proposed method achieves reconstruction errors comparable to or better than those reported in the literature.
arXiv Detail & Related papers (2022-10-06T04:30:59Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - Adaptive Weighted Nonnegative Matrix Factorization for Robust Feature
Representation [9.844796520630522]
Nonnegative matrix factorization (NMF) has been widely used for dimensionality reduction in machine learning.
Traditional NMF does not properly handle outliers, making it sensitive to noise.
This paper proposes an adaptive weighted NMF, which introduces weights to emphasize the different importance of each data point.
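A minimal sketch of per-data-point weighted NMF with multiplicative updates is given below; the summary does not specify the adaptive weighting rule, so the residual-based reweighting here is only an assumption for illustration.
```python
# Hedged sketch: NMF where each data point (column of X) carries a weight,
# so likely outliers contribute less to the factorization.
import numpy as np


def weighted_nmf(X, rank, n_iter=300, reweight_every=20, eps=1e-12):
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = np.abs(rng.standard_normal((m, rank)))
    H = np.abs(rng.standard_normal((rank, n)))
    w = np.ones(n)                              # per-data-point weights
    for it in range(n_iter):
        M = np.broadcast_to(w, (m, n))          # column-constant weight matrix
        # standard multiplicative updates for weighted Frobenius NMF
        W *= ((M * X) @ H.T) / ((M * (W @ H)) @ H.T + eps)
        H *= (W.T @ (M * X)) / (W.T @ (M * (W @ H)) + eps)
        if (it + 1) % reweight_every == 0:
            # assumed rule: points with large residual get smaller weight
            r = np.linalg.norm(X - W @ H, axis=0)
            w = 1.0 / (1.0 + r / (np.median(r) + eps))
    return W, H, w
```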
arXiv Detail & Related papers (2022-06-07T05:27:08Z) - Log-based Sparse Nonnegative Matrix Factorization for Data
Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
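For intuition, a tiny comparison of the $\ell_{2,1}$ norm with a log-based column-wise measure is sketched below; the log(1 + ||x_j||_2) form is only an assumed stand-in, and the exact definition of the $\ell_{2,\log}$-(pseudo) norm should be taken from the paper itself.
```python
# Hedged illustration: l_{2,1} norm versus an assumed log-based column measure.
import numpy as np


def l21_norm(X):
    """Sum of column 2-norms; promotes column-wise sparsity."""
    return np.linalg.norm(X, axis=0).sum()


def l2log_measure(X):
    """Assumed form sum_j log(1 + ||x_j||_2); it grows sub-linearly, so large
    columns are penalized relatively less than under the l_{2,1} norm."""
    return np.log1p(np.linalg.norm(X, axis=0)).sum()


X = np.array([[3.0, 0.0, 0.01],
              [4.0, 0.0, 0.00]])
print(l21_norm(X))       # 5.01
print(l2log_measure(X))  # ~1.80
```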
arXiv Detail & Related papers (2022-04-22T11:38:10Z) - Robust Matrix Factorization with Grouping Effect [28.35582493230616]
We propose a novel method called Matrix Factorization with Grouping effect (GRMF).
The proposed GRMF can learn grouping structure and sparsity in MF without prior knowledge.
Experiments have been conducted on real-world data sets with outliers and noise contamination.
arXiv Detail & Related papers (2021-06-25T15:03:52Z) - Solving weakly supervised regression problem using low-rank manifold
regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z) - Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z) - Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z)