Adaptive and Implicit Regularization for Matrix Completion
- URL: http://arxiv.org/abs/2208.05640v1
- Date: Thu, 11 Aug 2022 05:00:58 GMT
- Title: Adaptive and Implicit Regularization for Matrix Completion
- Authors: Zhemin Li, Tao Sun, Hongxia Wang, Bao Wang
- Abstract summary: This paper proposes a new adaptive and implicit low-rank regularization that captures the low-rank prior dynamically from the training data.
We show that the adaptive regularization of AIR enhances the implicit regularization and vanishes at the end of training.
We validate AIR's effectiveness on various benchmark tasks, indicating that AIR is particularly favorable in scenarios where the missing entries are non-uniform.
- Score: 17.96984956202579
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Explicit low-rank regularization, e.g., nuclear norm regularization, has
been widely used in imaging sciences. However, implicit regularization has been
found to outperform explicit regularization in various image processing tasks.
Another issue is that a fixed explicit regularization limits applicability
across diverse images, since different images favor different features
captured by different explicit regularizations. As such, this paper proposes a
new adaptive and implicit low-rank regularization that captures the low-rank
prior dynamically from the training data. The core of our new adaptive and
implicit low-rank regularization is parameterizing the Laplacian matrix in the
Dirichlet energy-based regularization, which we call AIR.
Theoretically, we show that the adaptive component of AIR enhances
the implicit regularization and vanishes at the end of training. We validate
AIR's effectiveness on various benchmark tasks, indicating that AIR is
particularly favorable in scenarios where the missing entries are
non-uniform. The code can be found at https://github.com/lizhemin15/AIR-Net.
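To make the idea above concrete, here is a minimal sketch (not the authors' implementation; see the linked repository for that) of matrix completion with a Dirichlet energy regularizer tr(X^T L X), where the Laplacian L is rebuilt at every step from the current estimate's row similarities. The Gaussian-similarity graph, the toy sizes, and all hyperparameters are illustrative assumptions standing in for AIR's learned parameterization of L.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rank-2 ground truth and a random observation mask (assumed setup).
n, r = 20, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.5

def row_laplacian(X, sigma=1.0):
    """Graph Laplacian L = D - W from Gaussian row similarities of the
    current estimate -- a hand-crafted stand-in for AIR's learned Laplacian."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

X = np.zeros((n, n))
lam, lr = 1e-2, 1e-2
for step in range(500):
    L = row_laplacian(X)  # "adaptive": L tracks the current recovery
    # Gradient of 0.5*||mask*(X - M)||^2 + lam * tr(X^T L X); L is symmetric.
    grad = mask * (X - M) + 2 * lam * L @ X
    X -= lr * grad

rel_fit = np.linalg.norm(mask * (X - M)) / np.linalg.norm(mask * M)
```

As the recovered rows separate, the similarity weights (and hence L) shrink toward zero, loosely mirroring the paper's claim that the adaptive regularization vanishes at the end of training.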
Related papers
- Sparse is Enough in Fine-tuning Pre-trained Large Language Models [98.46493578509039]
We propose a gradient-based sparse fine-tuning algorithm named Sparse Increment Fine-Tuning (SIFT).
We validate its effectiveness on a range of tasks including the GLUE Benchmark and Instruction-tuning.
arXiv Detail & Related papers (2023-12-19T06:06:30Z) - The Implicit Bias of Batch Normalization in Linear Models and Two-layer
Linear Convolutional Neural Networks [117.93273337740442]
We show that gradient descent converges to a uniform margin classifier on the training data with an $\exp(-\Omega(\log^2 t))$ convergence rate.
We also show that batch normalization has an implicit bias towards a patch-wise uniform margin.
arXiv Detail & Related papers (2023-06-20T16:58:00Z) - Zero-Shot Anomaly Detection via Batch Normalization [58.291409630995744]
Anomaly detection plays a crucial role in many safety-critical application domains.
The challenge of adapting an anomaly detector to drift in the normal data distribution has led to the development of zero-shot AD techniques.
We propose a simple yet effective method called Adaptive Centered Representations (ACR) for zero-shot batch-level AD.
arXiv Detail & Related papers (2023-02-15T18:34:15Z) - AltUB: Alternating Training Method to Update Base Distribution of
Normalizing Flow for Anomaly Detection [1.3999481573773072]
Unsupervised anomaly detection has recently come into the spotlight in various practical domains.
One of the major approaches is the normalizing flow, which pursues an invertible transformation of a complex distribution, such as images, into a simple distribution such as N(0, I).
arXiv Detail & Related papers (2022-10-26T16:31:15Z) - Breaking Time Invariance: Assorted-Time Normalization for RNNs [5.229616140749998]
We propose a normalization method called Assorted-Time Normalization (ATN).
ATN preserves information from multiple consecutive time steps and normalizes using them.
Our experiments applying ATN to LN demonstrate consistent improvement on various tasks.
arXiv Detail & Related papers (2022-09-28T21:51:13Z) - AIR-Net: Adaptive and Implicit Regularization Neural Network for Matrix
Completion [6.846820212701818]
This work combines adaptive and implicit low-rank regularization that captures the prior dynamically according to the current recovered matrix.
Theoretical analyses show that the adaptive part of the AIR-Net enhances implicit regularization.
With complete flexibility to select neural networks for matrix representation, AIR-Net can be extended to solve more general inverse problems.
arXiv Detail & Related papers (2021-10-12T04:13:33Z) - Distribution Mismatch Correction for Improved Robustness in Deep Neural
Networks [86.42889611784855]
Normalization methods increase vulnerability with respect to noise and input corruptions.
We propose an unsupervised non-parametric distribution correction method that adapts the activation distribution of each layer.
In our experiments, we empirically show that the proposed method effectively reduces the impact of intense image corruptions.
arXiv Detail & Related papers (2021-10-05T11:36:25Z) - Efficient Semantic Image Synthesis via Class-Adaptive Normalization [116.63715955932174]
Class-adaptive normalization (CLADE) is a lightweight but equally-effective variant that is only adaptive to semantic class.
We introduce intra-class positional map encoding calculated from semantic layouts to modulate the normalization parameters of CLADE.
The proposed CLADE can be generalized to different SPADE-based methods while achieving comparable generation quality compared to SPADE.
arXiv Detail & Related papers (2020-12-08T18:59:32Z) - A Scalable, Adaptive and Sound Nonconvex Regularizer for Low-rank Matrix
Completion [60.52730146391456]
We propose a new scalable nonconvex low-rank regularizer called the "nuclear Frobenius norm" regularizer, which is adaptive and sound.
It bypasses the computation of singular values and allows fast optimization.
It obtains state-of-the-art recovery performance while being the fastest among existing matrix learning methods.
arXiv Detail & Related papers (2020-08-14T18:47:58Z) - Adaptive L2 Regularization in Person Re-Identification [0.9195729979000402]
We introduce an adaptive L2 regularization mechanism in the setting of person re-identification.
Experiments on the Market-1501, DukeMTMC-reID and MSMT17 datasets validate the effectiveness of our framework.
arXiv Detail & Related papers (2020-07-15T17:50:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.