PMaF: Deep Declarative Layers for Principal Matrix Features
- URL: http://arxiv.org/abs/2306.14759v3
- Date: Sat, 1 Jul 2023 05:42:57 GMT
- Title: PMaF: Deep Declarative Layers for Principal Matrix Features
- Authors: Zhiwei Xu, Hao Wang, Yanbin Liu, Stephen Gould
- Abstract summary: We explore two differentiable deep declarative layers, namely least squares on sphere (LESS) and implicit eigen decomposition (IED). Both layers represent data features with a low-dimensional vector that captures the dominant information of a high-dimensional matrix.
- Score: 37.662505982849844
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We explore two differentiable deep declarative layers, namely least squares
on sphere (LESS) and implicit eigen decomposition (IED), for learning the
principal matrix features (PMaF). These layers represent data features with a
low-dimensional vector containing dominant information from a
high-dimensional matrix. We first solve the problems with iterative
optimization in the forward pass and then backpropagate the solution for
implicit gradients under a bi-level optimization framework. Particularly,
adaptive descent steps with the backtracking line search method and descent
decay in the tangent space are studied to improve the forward pass efficiency
of LESS. Meanwhile, the structure of the data is exploited to greatly reduce
the computational complexity of the backward pass in LESS and IED. Empirically, we
demonstrate the superiority of our layers over the off-the-shelf baselines by
comparing the solution optimality and computational requirements.
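As a rough illustration of the LESS forward pass described above, the sketch below minimizes a least-squares objective over the unit sphere by projected (Riemannian) gradient descent with a backtracking line search. The objective, step rule, and stopping criteria are generic assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def less_forward(A, b, max_iter=200, tol=1e-8):
    """Sphere-constrained least squares: min ||A x - b||^2  s.t. ||x|| = 1,
    via projected gradient descent with backtracking line search."""
    m, n = A.shape
    f = lambda v: 0.5 * np.linalg.norm(A @ v - b) ** 2
    retract = lambda v: v / np.linalg.norm(v)    # project back onto the sphere
    x = np.ones(n) / np.sqrt(n)                  # start on the unit sphere
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        rgrad = grad - (grad @ x) * x            # tangent-space projection at x
        if np.linalg.norm(rgrad) < tol:
            break
        t = 1.0                                  # backtracking line search
        while f(retract(x - t * rgrad)) > f(x) - 1e-4 * t * (rgrad @ rgrad):
            t *= 0.5
            if t < 1e-12:
                return x                         # no admissible step found
        x = retract(x - t * rgrad)
    return x
```

Because each accepted step satisfies a sufficient-decrease condition after retraction, the objective is monotonically non-increasing while the iterate stays exactly on the sphere.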
Related papers
- Dimension reduction via score ratio matching [0.9012198585960441]
We propose a framework, derived from score-matching, to extend gradient-based dimension reduction to problems where gradients are unavailable.
We show that our approach outperforms standard score-matching for problems with low-dimensional structure.
arXiv Detail & Related papers (2024-10-25T22:21:03Z) - Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z) - LancBiO: dynamic Lanczos-aided bilevel optimization via Krylov subspace [4.917399520581689]
In this paper, we construct a sequence of low-dimensional approximate Krylov subspaces with the aid of the Lanczos process.
The constructed subspace is able to dynamically and incrementally approximate the Hessian inverse vector product.
We also propose a provable subspace-based framework for bilevel problems where one central step is to solve a small-size tridiagonal linear system.
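The Lanczos-aided step can be sketched as follows: assuming a symmetric positive-definite Hessian H, k Lanczos iterations build an orthonormal Krylov basis Q and a small tridiagonal T = QᵀHQ, and the Hessian-inverse-vector product is approximated by solving the k×k tridiagonal system. This is a generic textbook illustration, not LancBiO's actual implementation.

```python
import numpy as np

def lanczos_solve(H, v, k=20):
    """Approximate H^{-1} v for symmetric positive-definite H using k steps
    of the Lanczos process: build an orthonormal Krylov basis Q and the
    tridiagonal T = Q^T H Q, then solve the small tridiagonal system
    T y = ||v|| e_1 and return the approximation Q y."""
    n = v.size
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    q = v / np.linalg.norm(v)
    q_prev = np.zeros(n)
    b_prev = 0.0
    for j in range(k):
        Q[:, j] = q
        w = H @ q - b_prev * q_prev          # three-term Lanczos recurrence
        alpha[j] = q @ w
        w -= alpha[j] * q
        b_norm = np.linalg.norm(w)
        if b_norm < 1e-12:                   # Krylov subspace became invariant
            k = j + 1
            break
        beta[j] = b_norm
        q_prev, q, b_prev = q, w / b_norm, b_norm
    T = (np.diag(alpha[:k])
         + np.diag(beta[:k - 1], 1)
         + np.diag(beta[:k - 1], -1))
    rhs = np.zeros(k)
    rhs[0] = np.linalg.norm(v)
    y = np.linalg.solve(T, rhs)              # small tridiagonal linear system
    return Q[:, :k] @ y
```

With k equal to the full dimension the approximation is exact (up to roundoff); in practice k is kept small so that only the k×k tridiagonal system needs solving.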
arXiv Detail & Related papers (2024-04-04T09:57:29Z) - Mode-wise Principal Subspace Pursuit and Matrix Spiked Covariance Model [13.082805815235975]
We introduce a novel framework called Mode-wise Principal Subspace Pursuit (MOP-UP) to extract hidden variations in both the row and column dimensions for matrix data.
The effectiveness and practical merits of the proposed framework are demonstrated through experiments on both simulated and real datasets.
arXiv Detail & Related papers (2023-07-02T13:59:47Z) - Sufficient dimension reduction for feature matrices [3.04585143845864]
We propose a method called principal support matrix machine (PSMM) for the matrix sufficient dimension reduction.
Our numerical analysis demonstrates that the PSMM outperforms existing methods and has strong interpretability in real data applications.
arXiv Detail & Related papers (2023-03-07T23:16:46Z) - Laplacian-based Cluster-Contractive t-SNE for High Dimensional Data
Visualization [20.43471678277403]
We propose LaptSNE, a new graph-based dimensionality reduction method based on t-SNE.
Specifically, LaptSNE leverages the eigenvalue information of the graph Laplacian to shrink the potential clusters in the low-dimensional embedding.
We show how to calculate the gradient analytically, which may be of broad interest when considering optimization with Laplacian-composited objective.
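The analytic gradient mentioned above can be illustrated with the standard matrix result that, for a simple eigenvalue λᵢ of a symmetric matrix L with unit eigenvector vᵢ, ∂λᵢ/∂L = vᵢvᵢᵀ. The sketch below (a generic illustration, not LaptSNE's code) uses this to differentiate the sum of the k smallest Laplacian eigenvalues.

```python
import numpy as np

def sum_smallest_eigs_grad(L, k):
    """Gradient of f(L) = sum of the k smallest eigenvalues of a symmetric
    matrix L. For a simple eigenpair (lam_i, v_i), d lam_i / d L = v_i v_i^T,
    so the gradient is V_k V_k^T with V_k the bottom-k eigenvectors."""
    _, V = np.linalg.eigh(L)     # eigenvalues returned in ascending order
    Vk = V[:, :k]
    return Vk @ Vk.T
```

A finite-difference check against a small symmetric perturbation confirms the formula whenever the k-th and (k+1)-th eigenvalues are separated.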
arXiv Detail & Related papers (2022-07-25T14:10:24Z) - Intermediate Layer Optimization for Inverse Problems using Deep
Generative Models [86.29330440222199]
ILO is a novel optimization algorithm for solving inverse problems with deep generative models.
We empirically show that our approach outperforms state-of-the-art methods introduced in StyleGAN-2 and PULSE for a wide range of inverse problems.
arXiv Detail & Related papers (2021-02-15T06:52:22Z) - Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under the sparsity constraint.
arXiv Detail & Related papers (2020-06-16T13:41:54Z) - Effective Dimension Adaptive Sketching Methods for Faster Regularized
Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
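A minimal sketch-and-solve illustration of the idea, using a plain Gaussian embedding (the paper's adaptive, effective-dimension-based scheme and its SRHT variant are more sophisticated than this sketch):

```python
import numpy as np

def sketched_ridge(A, b, lam, m, seed=None):
    """Approximate the L2-regularized least-squares solution
    argmin_x ||A x - b||^2 + lam ||x||^2 by first compressing the
    n x d problem with an m x n Gaussian sketch S, then solving the
    smaller normal equations on (S A, S b)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian embedding
    SA, Sb = S @ A, S @ b
    return np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)
```

The sketch size m trades accuracy for speed: larger m makes the sketched objective a tighter surrogate for the original one, while the dominant cost drops from O(nd^2) to O(md^2) plus the cost of applying S.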
arXiv Detail & Related papers (2020-06-10T15:00:09Z) - Two-Dimensional Semi-Nonnegative Matrix Factorization for Clustering [50.43424130281065]
We propose a new Semi-Nonnegative Matrix Factorization method for 2-dimensional (2D) data, named TS-NMF.
It overcomes the drawback of existing methods that seriously damage the spatial information of the data by converting 2D data to vectors in a preprocessing step.
arXiv Detail & Related papers (2020-05-19T05:54:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.