Reverse Convolution and Its Applications to Image Restoration
- URL: http://arxiv.org/abs/2508.09824v2
- Date: Fri, 15 Aug 2025 01:40:46 GMT
- Title: Reverse Convolution and Its Applications to Image Restoration
- Authors: Xuhong Huang, Shiqi Liu, Kai Zhang, Ying Tai, Jian Yang, Hui Zeng, Lei Zhang
- Abstract summary: Convolution and transposed convolution are fundamental operators widely used in neural networks. We propose a novel depthwise reverse convolution operator as an initial attempt to effectively reverse depthwise convolution. We further construct a reverse convolution block by combining it with layer normalization, 1$\times$1 convolution, and GELU activation, forming a Transformer-like structure.
- Score: 48.038943893721296
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolution and transposed convolution are fundamental operators widely used in neural networks. However, transposed convolution (a.k.a. deconvolution) does not serve as a true inverse of convolution due to inherent differences in their mathematical formulations. To date, no reverse convolution operator has been established as a standard component in neural architectures. In this paper, we propose a novel depthwise reverse convolution operator as an initial attempt to effectively reverse depthwise convolution by formulating and solving a regularized least-squares optimization problem. We thoroughly investigate its kernel initialization, padding strategies, and other critical aspects to ensure its effective implementation. Building upon this operator, we further construct a reverse convolution block by combining it with layer normalization, 1$\times$1 convolution, and GELU activation, forming a Transformer-like structure. The proposed operator and block can directly replace conventional convolution and transposed convolution layers in existing architectures, leading to the development of ConverseNet. Corresponding to typical image restoration models such as DnCNN, SRResNet and USRNet, we train three variants of ConverseNet for Gaussian denoising, super-resolution and deblurring, respectively. Extensive experiments demonstrate the effectiveness of the proposed reverse convolution operator as a basic building module. We hope this work could pave the way for developing new operators in deep model design and applications.
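The paper's operator is learned end-to-end inside a network, and its exact formulation is not given here. As a rough illustration of the underlying idea only, the regularized least-squares problem $x^* = \arg\min_x \|k \ast x - y\|^2 + \lambda \|x\|^2$ admits a closed-form solution under circular boundary conditions, computable per channel in the Fourier domain. The following NumPy sketch is a hypothetical single-step version of such a depthwise reversal, not the authors' implementation:

```python
import numpy as np

def reverse_conv2d_depthwise(y, kernels, lam=1e-2):
    """Hypothetical sketch: per-channel regularized least-squares reversal.

    Solves, for each channel c,
        x* = argmin_x ||k_c (*) x - y_c||^2 + lam * ||x||^2
    in closed form under circular boundary conditions:
        X = conj(K) * Y / (|K|^2 + lam)   (elementwise in the Fourier domain)

    y:       (C, H, W) observed feature map
    kernels: (C, kh, kw) depthwise kernels
    """
    C, H, W = y.shape
    x = np.empty((C, H, W))
    for c in range(C):
        K = np.fft.fft2(kernels[c], s=(H, W))  # zero-pad kernel to map size
        Y = np.fft.fft2(y[c])
        X = np.conj(K) * Y / (np.abs(K) ** 2 + lam)
        x[c] = np.real(np.fft.ifft2(X))
    return x
```

With a well-conditioned kernel and a small `lam`, applying this after a circular depthwise convolution recovers the input almost exactly; larger `lam` trades fidelity for stability when the kernel nearly cancels some frequencies.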
Related papers
- From CNNs to Shift-Invariant Twin Models Based on Complex Wavelets [7.812210699650151]
We replace the first-layer combination "real-valued convolutions + max pooling" (RMax) with "complex-valued convolutions + modulus" (CMod).
We claim that CMod and RMax produce comparable outputs when the convolution kernel is band-pass and oriented.
Our approach achieves superior accuracy on ImageNet and CIFAR-10 classification tasks.
arXiv Detail & Related papers (2022-12-01T09:42:55Z)
- Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration [77.99182201815763]
Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks.
We propose a novel monomial selection algorithm based on pruning methods to allow an application to more complex problems.
We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2022-02-08T16:16:11Z)
- OneDConv: Generalized Convolution For Transform-Invariant Representation [76.15687106423859]
We propose a novel generalized one-dimensional convolutional operator (OneDConv).
It dynamically transforms the convolution kernels based on the input features in a computationally and parametrically efficient manner.
It improves the robustness and generalization of convolution without sacrificing the performance on common images.
arXiv Detail & Related papers (2022-01-15T07:44:44Z)
- X-volution: On the unification of convolution and self-attention [52.80459687846842]
We propose a multi-branch elementary module composed of both convolution and self-attention operation.
The proposed X-volution achieves highly competitive visual understanding improvements.
arXiv Detail & Related papers (2021-06-04T04:32:02Z)
- Orthogonalizing Convolutional Layers with the Cayley Transform [83.73855414030646]
We propose and evaluate an alternative approach to parameterize convolutional layers that are constrained to be orthogonal.
We show that our method indeed preserves orthogonality to a high degree even for large convolutions.
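At the matrix level, the Cayley transform maps any skew-symmetric matrix $A$ to an orthogonal matrix $Q = (I + A)^{-1}(I - A)$; the paper's contribution is a convolutional analogue of this applied in the Fourier domain. A minimal NumPy sketch of the plain matrix case (not the convolutional parameterization itself):

```python
import numpy as np

def cayley(A):
    """Cayley transform: skew-symmetric A -> orthogonal Q = (I + A)^{-1} (I - A).

    I + A is always invertible for real skew-symmetric A, since the
    eigenvalues of A are purely imaginary.
    """
    I = np.eye(A.shape[0])
    return np.linalg.solve(I + A, I - A)
```

Because any unconstrained matrix $B$ yields a skew-symmetric $A = B - B^\top$, this gives a smooth, unconstrained parameterization of (a subset of) the orthogonal matrices, which is what makes it attractive for gradient-based training.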
arXiv Detail & Related papers (2021-04-14T23:54:55Z)
- Convolutional Normalization: Improving Deep Convolutional Network Robustness and Training [44.66478612082257]
Normalization techniques have become a basic component in modern convolutional neural networks (ConvNets).
We introduce a simple and efficient "convolutional normalization" method that can fully exploit the convolutional structure in the Fourier domain.
We show that convolutional normalization can reduce the layerwise spectral norm of the weight matrices and hence improve the Lipschitzness of the network.
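The Fourier-domain connection rests on a classical fact: a circular convolution is a circulant operator, which the DFT diagonalizes, so its singular values are the magnitudes of the kernel's DFT and its spectral norm is their maximum. The paper handles the multi-channel 2-D case; the toy single-channel 1-D version can be sketched and checked directly:

```python
import numpy as np

def circular_conv_spectral_norm(kernel, n):
    """Spectral norm of the n x n circulant matrix implementing 1-D circular
    convolution with `kernel` (single channel): the DFT diagonalizes
    circulants, so the singular values are |FFT(kernel)| and the spectral
    norm is their maximum."""
    return np.max(np.abs(np.fft.fft(kernel, n)))
```

This is why kernel-space operations in the Fourier domain can control the layerwise spectral norm, and hence the Lipschitz constant, far more cheaply than forming the full convolution matrix.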
arXiv Detail & Related papers (2021-03-01T00:33:04Z)
- DO-Conv: Depthwise Over-parameterized Convolutional Layer [66.46704754669169]
We propose to augment a convolutional layer with an additional depthwise convolution, where each input channel is convolved with a different 2D kernel.
We show with extensive experiments that the mere replacement of conventional convolutional layers with DO-Conv layers boosts the performance of CNNs.
arXiv Detail & Related papers (2020-06-22T06:57:10Z)
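DO-Conv's over-parameterization costs nothing at inference because composed linear convolutions collapse into a single convolution, so the extra kernel can be folded into the main one after training. The actual DO-Conv fold contracts a depth-multiplier dimension across channels; the sketch below is a deliberately simplified single-channel 1-D illustration of the folding principle, with hypothetical function names:

```python
import numpy as np

def circ_conv(x, k):
    """Circular 1-D convolution via the FFT."""
    n = len(x)
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k, n)))

def fold_kernels(d, w, n):
    """Fold an extra kernel d and a main kernel w into one length-n kernel:
    applying the folded kernel once equals applying d, then w (the composed
    transfer function is the elementwise product of the two DFTs)."""
    return np.real(np.fft.ifft(np.fft.fft(d, n) * np.fft.fft(w, n)))
```

Training with the two-kernel form gives the optimizer extra degrees of freedom, while the folded form keeps deployment cost identical to a conventional layer.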