Multi-Objective Matrix Normalization for Fine-grained Visual Recognition
- URL: http://arxiv.org/abs/2003.13272v2
- Date: Fri, 10 Apr 2020 07:33:42 GMT
- Title: Multi-Objective Matrix Normalization for Fine-grained Visual Recognition
- Authors: Shaobo Min, Hantao Yao, Hongtao Xie, Zheng-Jun Zha, and Yongdong Zhang
- Abstract summary: Bilinear pooling achieves great success in fine-grained visual recognition (FGVC)
Recent methods have shown that the matrix power normalization can stabilize the second-order information in bilinear features.
We propose an efficient Multi-Objective Matrix Normalization (MOMN) method that can simultaneously normalize a bilinear representation in terms of square-root, low-rank, and sparsity.
- Score: 153.49014114484424
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bilinear pooling achieves great success in fine-grained visual recognition
(FGVC). Recent methods have shown that the matrix power normalization can
stabilize the second-order information in bilinear features, but some problems,
e.g., redundant information and over-fitting, remain to be resolved. In this
paper, we propose an efficient Multi-Objective Matrix Normalization (MOMN)
method that can simultaneously normalize a bilinear representation in terms of
square-root, low-rank, and sparsity. These three regularizers can not only
stabilize the second-order information, but also compact the bilinear features
and promote model generalization. In MOMN, a core challenge is how to jointly
optimize three non-smooth regularizers of different convex properties. To this
end, MOMN first formulates them into an augmented Lagrange formula with
approximated regularizer constraints. Then, auxiliary variables are introduced
to relax different constraints, which allow each regularizer to be solved
alternately. Finally, several updating strategies based on gradient descent are
designed to obtain consistent convergence and efficient implementation.
Consequently, MOMN is implemented with only matrix multiplication, which is
well-compatible with GPU acceleration, and the normalized bilinear features are
stabilized and discriminative. Experiments on five public benchmarks for FGVC
demonstrate that the proposed MOMN is superior to existing normalization-based
methods in terms of both accuracy and efficiency. The code is available:
https://github.com/mboboGO/MOMN.
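As a rough, unofficial illustration of the kind of matrix-multiplication-only normalization the abstract describes, the sketch below performs bilinear pooling followed by an iterative Newton-Schulz approximation of the matrix square root, assuming a PyTorch-style tensor layout. It is not the authors' MOMN implementation: the joint low-rank and sparsity regularizers and the augmented Lagrange updates are omitted, and all function and variable names are illustrative.

```python
import torch

def bilinear_pool(features):
    """Bilinear (second-order) pooling of convolutional features.

    features: tensor of shape (batch, channels, height, width).
    Returns a (batch, channels, channels) second-order matrix per image.
    """
    b, c, h, w = features.shape
    x = features.reshape(b, c, h * w)
    return x @ x.transpose(1, 2) / (h * w)

def newton_schulz_sqrt(A, num_iters=5, eps=1e-6):
    """Approximate matrix square root via Newton-Schulz iteration.

    The loop uses only matrix multiplications (the GPU-friendly property
    the abstract highlights) and approximates A^{1/2} for a batch of
    symmetric positive semi-definite matrices A.
    """
    b, n, _ = A.shape
    norm = A.norm(dim=(1, 2), keepdim=True) + eps
    Y = A / norm                                   # pre-scale so the iteration converges
    I = torch.eye(n, device=A.device, dtype=A.dtype).expand(b, n, n)
    Z = I
    for _ in range(num_iters):
        T = 0.5 * (3.0 * I - Z @ Y)
        Y = Y @ T                                  # converges to sqrt of the scaled matrix
        Z = T @ Z                                  # converges to inverse sqrt of the scaled matrix
    return Y * norm.sqrt()                         # undo the pre-scaling

# Usage: normalize bilinear features from a dummy backbone output.
feats = torch.randn(2, 64, 7, 7)
B = bilinear_pool(feats)
B_sqrt = newton_schulz_sqrt(B)
print(B_sqrt.shape)  # torch.Size([2, 64, 64])
```

Because each update is a plain matrix product, the whole normalization maps directly onto batched GPU kernels, which is the efficiency argument the abstract makes for MOMN.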
Related papers
- Global optimization of MPS in quantum-inspired numerical analysis [0.0]
The study focuses on the search for the lowest eigenstates of a Hamiltonian.
Five algorithms are introduced: imaginary-time evolution, steepest gradient descent, an improved descent, an implicitly restarted Arnoldi method, and density matrix renormalization group (DMRG) optimization.
arXiv Detail & Related papers (2023-03-16T16:03:51Z)
- Asymmetric Scalable Cross-modal Hashing [51.309905690367835]
Cross-modal hashing is a successful method for solving the large-scale multimedia retrieval problem.
We propose a novel Asymmetric Scalable Cross-Modal Hashing (ASCMH) to address these issues.
Our ASCMH outperforms the state-of-the-art cross-modal hashing methods in terms of accuracy and efficiency.
arXiv Detail & Related papers (2022-07-26T04:38:47Z)
- Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z)
- Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace matrix.
arXiv Detail & Related papers (2021-09-30T19:17:35Z)
- Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z)
- Hybrid Trilinear and Bilinear Programming for Aligning Partially Overlapping Point Sets [85.71360365315128]
In many applications, we need algorithms that can align partially overlapping point sets and are invariant to the corresponding transformation.
We first show that the objective is a cubic polynomial function. We then utilize the convex envelopes of trilinear and bilinear monomials to derive its lower bound.
We next develop a branch-and-bound (BnB) algorithm which only branches over the transformation variables and runs efficiently.
arXiv Detail & Related papers (2021-01-19T04:24:23Z)
- Accurate Optimization of Weighted Nuclear Norm for Non-Rigid Structure from Motion [15.641335104467982]
We show that more accurate results can be achieved with 2nd order methods.
Our main result shows how to construct bilinear formulations for a general class of regularizers.
We show experimentally, on a number of structure from motion problems, that our approach outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-03-23T13:52:16Z)
- A Block Coordinate Descent-based Projected Gradient Algorithm for Orthogonal Non-negative Matrix Factorization [0.0]
This article utilizes the projected gradient (PG) method for an orthogonal non-negative matrix factorization (ONMF) problem.
We penalise the orthonormality constraints and apply the PG method via a block coordinate descent approach.
arXiv Detail & Related papers (2020-03-23T13:24:43Z)
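The penalized projected-gradient approach summarized above can be illustrated with a small, generic sketch. The code below is not taken from the cited paper; it assumes a simple quadratic penalty on the orthonormality constraint, untuned step sizes, and illustrative names, and it alternates projected gradient steps over the two factor blocks.

```python
import numpy as np

def onmf_penalized_pg(X, rank, penalty=1.0, lr=5e-4, iters=1000, seed=0):
    """Toy orthogonal NMF via penalized projected gradient, block by block.

    Minimizes ||X - W H||_F^2 + penalty * ||W^T W - I||_F^2 subject to
    W >= 0 and H >= 0, taking a gradient step in one block at a time and
    projecting onto the non-negative orthant (step sizes are not tuned).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    I = np.eye(rank)
    for _ in range(iters):
        # Block 1: W-step, including the quadratic orthogonality penalty.
        grad_W = 2.0 * (W @ H - X) @ H.T + 4.0 * penalty * W @ (W.T @ W - I)
        W = np.maximum(W - lr * grad_W, 0.0)       # project onto W >= 0
        # Block 2: H-step (plain non-negative least-squares part).
        grad_H = 2.0 * W.T @ (W @ H - X)
        H = np.maximum(H - lr * grad_H, 0.0)       # project onto H >= 0
    return W, H

# Usage on random non-negative data.
X = np.random.default_rng(1).random((50, 30))
W, H = onmf_penalized_pg(X, rank=5)
print(np.round(np.linalg.norm(X - W @ H) / np.linalg.norm(X), 3))
```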