One-Sided Box Filter for Edge Preserving Image Smoothing
- URL: http://arxiv.org/abs/2108.05021v1
- Date: Wed, 11 Aug 2021 04:22:38 GMT
- Title: One-Sided Box Filter for Edge Preserving Image Smoothing
- Authors: Yuanhao Gong
- Abstract summary: We present a one-sided box filter that smooths the signal while keeping its discontinuous features.
More specifically, we perform the box filter on eight one-sided windows, leading to a one-sided box filter that preserves corners and edges.
- Score: 9.67565473617028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image smoothing is a fundamental task in signal processing. For this
task, the box filter is well known. However, the box filter cannot preserve
features of the signal such as edges, corners, and the jump in a step function.
In this paper, we present a one-sided box filter that smooths the signal while
keeping its discontinuous features. More specifically, we perform the box
filter on eight one-sided windows, leading to a one-sided box filter that
preserves corners and edges. Our filter inherits the constant $O(1)$
computational complexity of the original box filter with respect to the window
size and also the linear $O(N)$ computational complexity with respect to the
total number of samples. We perform several experiments to show the
efficiency and effectiveness of this filter. We further compare our filter with
other state-of-the-art edge-preserving methods. Our filter can be deployed
in a wide range of applications where the classical box filter is adopted.
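As a rough, hedged illustration of the idea, the following is a minimal 1D sketch: each sample is averaged over its left-sided and its right-sided window via prefix sums, and the side whose mean is closest to the sample is kept. The reduction to one dimension (two windows instead of eight) and the selection rule are assumptions made for illustration, not the paper's exact algorithm; the prefix sums do, however, give the same O(1)-per-sample, O(N)-overall cost noted above.

```python
# Minimal 1D sketch of a one-sided box filter (illustrative assumptions only):
# average over the left-sided and the right-sided window, keep the side whose
# mean is closest to the sample, so the average never straddles a jump.
import numpy as np

def one_sided_box_filter_1d(x, r):
    """Smooth a 1D signal x with radius-r one-sided windows (sketch)."""
    x = np.asarray(x, dtype=float)
    # Prefix sums make each windowed mean O(1), so the whole pass is O(N).
    c = np.concatenate(([0.0], np.cumsum(x)))
    n = len(x)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - r), min(n - 1, i + r)
        left = (c[i + 1] - c[lo]) / (i - lo + 1)    # mean over window [lo, i]
        right = (c[hi + 1] - c[i]) / (hi - i + 1)   # mean over window [i, hi]
        # Assumed selection rule: keep the one-sided mean closest to the sample.
        out[i] = left if abs(left - x[i]) <= abs(right - x[i]) else right
    return out

# A noisy step keeps its jump sharp, unlike with a symmetric box filter.
signal = np.r_[np.zeros(50), np.ones(50)] + 0.05 * np.random.randn(100)
smoothed = one_sided_box_filter_1d(signal, r=7)
```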
Related papers
- An Uncertainty Principle for Linear Recurrent Neural Networks [54.13281679205581]
We build a linear filter of order $S$ that approximates the filter that looks $K$ time steps in the past.
We fully characterize the problem by providing lower bounds of approximation, as well as explicit filters that achieve this lower bound up to constants.
The optimal performance highlights an uncertainty principle: the filter has to average values around the $K$-th time step in the past with a range (width) that is proportional to $K/S$.
arXiv Detail & Related papers (2025-02-13T13:01:46Z) - Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters [151.2423480789271]
A novel pruning method, termed CLR-RNF, is proposed for filter-level network pruning.
We conduct image classification on CIFAR-10 and ImageNet to demonstrate the superiority of our CLR-RNF over state-of-the-art methods.
arXiv Detail & Related papers (2022-02-15T04:53:24Z) - Image Edge Restoring Filter [4.060948640328565]
We propose the image Edge Restoring Filter (ERF) to restore the blurred edge pixels in the output of local smoothing filters.
The proposed filter can be applied after many local smoothing filters.
arXiv Detail & Related papers (2021-12-27T07:02:01Z) - Reverse image filtering using total derivative approximation and
accelerated gradient descent [82.93345261434943]
We address a new problem of reversing the effect of an image filter, which can be linear or nonlinear.
The assumption is that the algorithm of the filter is unknown and the filter is available as a black box.
We formulate this inverse problem as minimizing a local patch-based cost function and use the total derivative to approximate the gradient, which is used in gradient descent to solve the problem (see the sketch after this list).
arXiv Detail & Related papers (2021-12-08T05:16:11Z) - Learning Versatile Convolution Filters for Efficient Visual Recognition [125.34595948003745]
This paper introduces versatile filters to construct efficient convolutional neural networks.
We conduct a theoretical analysis of network complexity and introduce an efficient convolution scheme.
Experimental results on benchmark datasets and neural networks demonstrate that our versatile filters achieve accuracy comparable to that of the original filters.
arXiv Detail & Related papers (2021-09-20T06:07:14Z) - Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient (see the sketch after this list).
arXiv Detail & Related papers (2021-06-02T19:15:34Z) - Quarter Laplacian Filter for Edge Aware Image Processing [32.885698849515045]
This paper presents a quarter Laplacian filter that can preserve corners and edges during image smoothing.
We show its edge-preserving property in several image processing tasks, including image smoothing, texture enhancement, and low-light image enhancement.
arXiv Detail & Related papers (2021-01-20T02:29:54Z) - Pruning Filter in Filter [38.6403556260338]
Pruning has become a very powerful and effective technique to compress and accelerate modern neural networks.
We propose stripe-wise pruning (SWP), which prunes the filter in the filter to achieve finer granularity than traditional filter pruning (FP) methods.
We demonstrate that SWP is more effective than previous FP-based methods and achieves a state-of-the-art pruning ratio on the CIFAR-10 and ImageNet datasets.
arXiv Detail & Related papers (2020-09-30T03:35:16Z) - Filter Grafting for Deep Neural Networks: Reason, Method, and
Cultivation [86.91324735966766]
Filters are the key components in modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting to improve the representation capability of CNNs.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z) - Convolutional Neural Network Pruning Using Filter Attenuation [10.282782377635106]
Filters are essential elements in convolutional neural networks (CNNs).
In filter pruning methods, a filter with all of its components, including channels and connections, is removed.
We propose a CNN pruning method based on filter attenuation in which weak filters are not directly removed.
arXiv Detail & Related papers (2020-02-09T06:31:24Z) - Filter Grafting for Deep Neural Networks [71.39169475500324]
Filter grafting aims to improve the representation capability of Deep Neural Networks (DNNs).
We develop an entropy-based criterion to measure the information of filters and an adaptive weighting strategy for balancing the grafted information among networks.
For example, the grafted MobileNetV2 outperforms the non-grafted MobileNetV2 by about 7 percent on the CIFAR-100 dataset.
arXiv Detail & Related papers (2020-01-15T03:18:57Z)
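For the reverse image filtering entry above, the following is a minimal sketch of the kind of iterative scheme such methods rely on, assuming the filter's derivative is treated as roughly the identity so the update reduces to x <- x - step * (f(x) - y); the Gaussian-blur stand-in, step size, and iteration count are illustrative assumptions, not the paper's patch-based cost function.

```python
# Hedged sketch of reverse filtering: the unknown filter f is only evaluated
# as a black box, and its derivative is approximated as (roughly) the identity,
# so each step simply subtracts the filtering residual. The blur, step size,
# and iteration count below are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def black_box_filter(img):
    # Stand-in for an unknown filter that is available only as a black box.
    return gaussian_filter(img, sigma=2.0)

def reverse_filter(y, f, step=1.0, iters=30):
    """Recover x with f(x) close to y, using only evaluations of f."""
    x = y.copy()
    for _ in range(iters):
        residual = f(x) - y       # how far the current estimate over-filters
        x = x - step * residual   # identity-like derivative approximation
    return x

rng = np.random.default_rng(0)
x_true = rng.random((64, 64))
y = black_box_filter(x_true)
x_rec = reverse_filter(y, black_box_filter)
print("filtering residual:", np.linalg.norm(black_box_filter(x_rec) - y))
```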
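For the unsharp mask guided filtering entry above, here is a minimal sketch of the generic unsharp-masking form that summary alludes to: a low-pass base computed from the target plus a single coefficient times the high-frequency residual of the guidance image. The box low-pass and the least-squares fit of that single coefficient are assumptions for illustration, not the paper's estimation procedure.

```python
# Hedged sketch of guided filtering written in an unsharp-masking form:
# output = low_pass(target) + c * (guidance - low_pass(guidance)).
# The box low-pass and the global least-squares fit of the single
# coefficient c are illustrative assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def unsharp_guided_filter(p, G, radius=4):
    """p: target image, G: guidance image, radius: low-pass window radius."""
    size = 2 * radius + 1
    lp_p = uniform_filter(p, size=size)       # low-pass base from the target
    hp_G = G - uniform_filter(G, size=size)   # high-frequency detail of the guidance
    # One global coefficient chosen by least squares so that lp_p + c * hp_G
    # best matches the target (an assumed, illustrative estimation rule).
    c = float(np.sum(hp_G * (p - lp_p)) / (np.sum(hp_G * hp_G) + 1e-8))
    return lp_p + c * hp_G

rng = np.random.default_rng(1)
G = rng.random((64, 64))                      # guidance with structure
p = G + 0.1 * rng.standard_normal((64, 64))   # noisy target sharing that structure
q = unsharp_guided_filter(p, G)
```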
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.