AdderSR: Towards Energy Efficient Image Super-Resolution
- URL: http://arxiv.org/abs/2009.08891v7
- Date: Tue, 4 May 2021 08:01:51 GMT
- Title: AdderSR: Towards Energy Efficient Image Super-Resolution
- Authors: Dehua Song, Yunhe Wang, Hanting Chen, Chang Xu, Chunjing Xu, Dacheng
Tao
- Abstract summary: This paper studies the single image super-resolution problem using adder neural networks (AdderNet)
Compared with convolutional neural networks, AdderNet utilizes additions to calculate the output features, thus avoiding the massive energy consumption of conventional multiplications.
- Score: 127.61437479490047
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies the single image super-resolution problem using adder
neural networks (AdderNet). Compared with convolutional neural networks,
AdderNet utilizes additions to calculate the output features, thus avoiding the
massive energy consumption of conventional multiplications. However, it is
very hard to directly inherit the existing success of AdderNet on large-scale
image classification to the image super-resolution task due to the different
calculation paradigm. Specifically, the adder operation cannot easily learn the
identity mapping, which is essential for image processing tasks. In addition,
the functionality of high-pass filters cannot be ensured by AdderNet. To this
end, we thoroughly analyze the relationship between an adder operation and the
identity mapping and insert shortcuts to enhance the performance of SR models
using adder networks. Then, we develop a learnable power activation for
adjusting the feature distribution and refining details. Experiments conducted
on several benchmark models and datasets demonstrate that our image
super-resolution models using AdderNets achieve performance and visual quality
comparable to those of their CNN baselines, with an approximately 2$\times$
reduction in energy consumption.
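The abstract names two components: shortcuts that restore the identity mapping a plain adder layer cannot easily learn, and a learnable power activation that adjusts the feature distribution and refines details. The following is a minimal sketch of how such a block could look in PyTorch; it is not the authors' released implementation, and the class and parameter names (Adder2d, PowerActivation, AdderSRBlock, alpha, eps) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Adder2d(nn.Module):
    """Adder layer: the output is the negative L1 distance between each filter
    and each input patch, so the main path uses no multiplications."""

    def __init__(self, in_ch, out_ch, k=3, stride=1, padding=1):
        super().__init__()
        self.k, self.stride, self.padding = k, stride, padding
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch * k * k) * 0.1)

    def forward(self, x):
        n, _, h, w = x.shape
        # Sliding patches: (N, C*k*k, L) where L is the number of output positions.
        patches = F.unfold(x, self.k, stride=self.stride, padding=self.padding)
        # Negative L1 distance between every filter and every patch: (N, out_ch, L).
        out = -(patches.unsqueeze(1) - self.weight[None, :, :, None]).abs().sum(dim=2)
        h_out = (h + 2 * self.padding - self.k) // self.stride + 1
        w_out = (w + 2 * self.padding - self.k) // self.stride + 1
        return out.view(n, -1, h_out, w_out)


class PowerActivation(nn.Module):
    """Learnable power activation: sign-preserving |y|^alpha with trainable alpha."""

    def __init__(self, init_alpha=1.0, eps=1e-8):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(init_alpha))
        self.eps = eps  # avoids gradient issues at |y| = 0

    def forward(self, y):
        return torch.sign(y) * (y.abs() + self.eps).pow(self.alpha)


class AdderSRBlock(nn.Module):
    """Adder block with a shortcut, since a plain adder layer cannot easily learn identity."""

    def __init__(self, channels):
        super().__init__()
        self.adder = Adder2d(channels, channels)
        self.act = PowerActivation()

    def forward(self, x):
        # The residual shortcut supplies the identity mapping the adder branch lacks.
        return x + self.act(self.adder(x))


if __name__ == "__main__":
    block = AdderSRBlock(8)
    print(block(torch.randn(1, 8, 16, 16)).shape)  # torch.Size([1, 8, 16, 16])
```

The shortcut in AdderSRBlock lets the block fall back to an approximate identity when the adder branch contributes little, which is the property the abstract identifies as missing from plain adder layers.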
Related papers
- Adaptive Convolutional Neural Network for Image Super-resolution [43.06377001247278]
We propose an adaptive convolutional neural network for image super-resolution (ADSRNet).
The upper network enhances the relations of contextual information, the salient information of a kernel mapping, and the relations between shallow and deep layers.
The lower network utilizes a symmetric architecture to enhance the relations between different layers and mine more structural information.
arXiv Detail & Related papers (2024-02-24T03:44:06Z) - HAT: Hybrid Attention Transformer for Image Restoration [61.74223315807691]
Transformer-based methods have shown impressive performance in image restoration tasks, such as image super-resolution and denoising.
We propose a new Hybrid Attention Transformer (HAT) to activate more input pixels for better restoration.
Our HAT achieves state-of-the-art performance both quantitatively and qualitatively.
arXiv Detail & Related papers (2023-09-11T05:17:55Z) - An Empirical Study of Adder Neural Networks for Object Detection [67.64041181937624]
Adder neural networks (AdderNets) have shown impressive performance on image classification with only addition operations.
We present an empirical study of AdderNets for object detection.
arXiv Detail & Related papers (2021-12-27T11:03:13Z) - Adder Neural Networks [75.54239599016535]
We present adder networks (AdderNets) to trade the massive multiplications in deep neural networks for much cheaper additions.
In AdderNets, we take the $\ell_p$-norm distance between filters and input features as the output response (a worked form of this response is given after this list).
We show that the proposed AdderNets can achieve 75.7% Top-1 accuracy and 92.3% Top-5 accuracy using ResNet-50 on the ImageNet dataset.
arXiv Detail & Related papers (2021-05-29T04:02:51Z) - Scalable Visual Transformers with Hierarchical Pooling [61.05787583247392]
We propose a Hierarchical Visual Transformer (HVT) which progressively pools visual tokens to shrink the sequence length.
It brings a great benefit to scalability: the dimensions of depth/width/resolution/patch size can be scaled without introducing extra computational complexity.
Our HVT outperforms the competitive baselines on ImageNet and CIFAR-100 datasets.
arXiv Detail & Related papers (2021-03-19T03:55:58Z) - AdderNet and its Minimalist Hardware Design for Energy-Efficient
Artificial Intelligence [111.09105910265154]
We present a novel minimalist hardware architecture using adder convolutional neural networks (AdderNets).
The whole AdderNet can practically achieve a 16% enhancement in speed.
We conclude that AdderNet is able to surpass all the other competitors.
arXiv Detail & Related papers (2021-01-25T11:31:52Z)
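For reference, the adder output response mentioned in the Adder Neural Networks entry above can be written out for the $\ell_1$ case ($p = 1$). The symbols below are illustrative: $X$ is the input feature map, $F$ a $d \times d \times c_{\text{in}}$ filter, and $Y(m, n, t)$ the response of the $t$-th filter at position $(m, n)$:

$$Y(m, n, t) = -\sum_{i=0}^{d-1} \sum_{j=0}^{d-1} \sum_{k=0}^{c_{\text{in}}-1} \left| X(m+i, n+j, k) - F(i, j, k, t) \right|,$$

so each multiply-accumulate of an ordinary convolution is replaced by a subtraction and an absolute value, which is where the energy savings come from.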