Image Inpainting by Multiscale Spline Interpolation
- URL: http://arxiv.org/abs/2001.03270v1
- Date: Fri, 10 Jan 2020 01:15:14 GMT
- Title: Image Inpainting by Multiscale Spline Interpolation
- Authors: Ghazale Ghorbanzade, Zahra Nabizadeh, Nader Karimi, Shadrokh Samavi
- Abstract summary: We propose a multi-scale image inpainting method that utilizes both local and global features.
On average, we achieved a 1.2 dB improvement over some existing inpainting approaches.
- Score: 9.561123408923489
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recovering the missing regions of an image is a task called image
inpainting. Depending on the shape of the missing areas, different methods are
presented in the literature. One of the challenges of this problem is
extracting features that lead to better results. Experimental results show that
both global and local features are useful for this purpose. In this paper, we
propose a multi-scale image inpainting method that utilizes both local and
global features. The first step of this method is to determine how many scales
we need to use, which depends on the width of the lines in the map of the
missing region. Then we apply adaptive image inpainting to the damaged areas of
the image, and the lost pixels are predicted. Each scale is inpainted and the
result is resized to the original size. Then a voting process produces the
final result. The proposed method is tested on damaged images with scratches
and creases. The metric that we use to evaluate our approach is PSNR. On
average, we achieved a 1.2 dB improvement over some existing inpainting
approaches.
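The abstract describes a concrete pipeline: estimate the number of scales from the width of the damaged lines, inpaint the lost pixels at each scale, resize every result back to the original size, and combine them by voting, with PSNR as the evaluation metric. The following is a minimal sketch of that pipeline under stated assumptions: it assumes 8-bit images, uses OpenCV's cv2.inpaint as a stand-in for the paper's adaptive spline interpolation, and the scale-selection heuristic, function names, and median-based voting are illustrative choices rather than the authors' implementation.

# Hedged sketch of the multi-scale inpaint-and-vote pipeline described in the abstract.
# cv2.inpaint stands in for the paper's adaptive spline interpolation (assumption).
import cv2
import numpy as np

def estimate_num_scales(mask, max_scales=4):
    # Approximate the width of the damaged lines with a distance transform:
    # the largest distance inside the mask is roughly half the line width.
    dist = cv2.distanceTransform((mask > 0).astype(np.uint8), cv2.DIST_L2, 5)
    line_width = 2.0 * dist.max()
    # Heuristic (assumption): wider scratches get more, coarser scales.
    return int(min(max_scales, max(1, np.ceil(np.log2(line_width + 1)))))

def multiscale_inpaint(image, mask):
    # image: 8-bit grayscale or BGR array; mask: nonzero where pixels are missing.
    h, w = mask.shape
    n_scales = estimate_num_scales(mask)
    candidates = []
    for s in range(n_scales):
        f = 1.0 / (2 ** s)                       # downscale factor for this level
        img_s = cv2.resize(image, None, fx=f, fy=f, interpolation=cv2.INTER_AREA)
        msk_s = cv2.resize(mask, None, fx=f, fy=f, interpolation=cv2.INTER_NEAREST)
        # Stand-in for the paper's adaptive spline interpolation of the lost pixels.
        filled = cv2.inpaint(img_s, (msk_s > 0).astype(np.uint8), 3, cv2.INPAINT_TELEA)
        # Resize each scale's result back to the original size.
        candidates.append(cv2.resize(filled, (w, h), interpolation=cv2.INTER_CUBIC))
    # "Voting": a per-pixel median over the per-scale predictions is one simple
    # realization of the voting step mentioned in the abstract.
    voted = np.median(np.stack(candidates, axis=0), axis=0).astype(image.dtype)
    result = image.copy()
    result[mask > 0] = voted[mask > 0]           # only the lost pixels are replaced
    return result

def psnr(reference, restored):
    # Evaluation metric used in the paper: peak signal-to-noise ratio, in dB.
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)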
Related papers
- Sketch-guided Image Inpainting with Partial Discrete Diffusion Process [5.005162730122933]
We introduce a novel partial discrete diffusion process (PDDP) for sketch-guided inpainting.
PDDP corrupts the masked regions of the image and reconstructs these masked regions conditioned on hand-drawn sketches.
The proposed novel transformer module accepts two inputs -- the image containing the masked region to be inpainted and the query sketch to model the reverse diffusion process.
arXiv Detail & Related papers (2024-04-18T07:07:38Z) - Learning to Rank Patches for Unbiased Image Redundancy Reduction [80.93989115541966]
Images suffer from heavy spatial redundancy because pixels in neighboring regions are spatially correlated.
Existing approaches strive to overcome this limitation by reducing less meaningful image regions.
We propose a self-supervised framework for image redundancy reduction called Learning to Rank Patches.
arXiv Detail & Related papers (2024-03-31T13:12:41Z) - Cylin-Painting: Seamless 360° Panoramic Image Outpainting
and Beyond [136.18504104345453]
We present a Cylin-Painting framework that involves meaningful collaborations between inpainting and outpainting.
The proposed algorithm can be effectively extended to other panoramic vision tasks, such as object detection, depth estimation, and image super-resolution.
arXiv Detail & Related papers (2022-04-18T21:18:49Z) - RePaint: Inpainting using Denoising Diffusion Probabilistic Models [161.74792336127345]
Free-form inpainting is the task of adding new content to an image in the regions specified by an arbitrary binary mask.
We propose RePaint: a Denoising Diffusion Probabilistic Model (DDPM) based inpainting approach that is applicable to even extreme masks.
We validate our method for both faces and general-purpose image inpainting using standard and extreme masks.
arXiv Detail & Related papers (2022-01-24T18:40:15Z) - Noise Doesn't Lie: Towards Universal Detection of Deep Inpainting [42.189768203036394]
We make the first attempt towards universal detection of deep inpainting, where the detection network can generalize well.
Our approach outperforms existing detection methods by a large margin and generalizes well to unseen deep inpainting techniques.
arXiv Detail & Related papers (2021-06-03T01:29:29Z) - Deep Two-Stage High-Resolution Image Inpainting [0.0]
In this article, we propose a method that solves the problem of inpainting arbitrary-size images.
For this, we propose to use information from neighboring pixels by shifting the original image in four directions.
This approach can work with existing inpainting models, making them almost resolution independent without the need for retraining (a toy illustration of the four-direction shift appears after this list).
arXiv Detail & Related papers (2021-04-27T20:32:21Z) - In&Out : Diverse Image Outpainting via GAN Inversion [89.84841983778672]
Image outpainting seeks a semantically consistent extension of the input image beyond its available content.
In this work, we formulate the problem from the perspective of inverting generative adversarial networks.
Our generator renders micro-patches conditioned on their joint latent code as well as their individual positions in the image.
arXiv Detail & Related papers (2021-04-01T17:59:10Z) - Painting Outside as Inside: Edge Guided Image Outpainting via
Bidirectional Rearrangement with Progressive Step Learning [18.38266676724225]
We propose a novel image outpainting method using bidirectional boundary region rearrangement.
The proposed method is compared with other state-of-the-art outpainting and inpainting methods both qualitatively and quantitatively.
The experimental results demonstrate that our method outperforms other methods and generates new images with 360° panoramic characteristics.
arXiv Detail & Related papers (2020-10-05T06:53:55Z) - Texture Memory-Augmented Deep Patch-Based Image Inpainting [121.41395272974611]
We propose a new deep inpainting framework where texture generation is guided by a texture memory of patch samples extracted from unmasked regions.
The framework has a novel design that allows texture memory retrieval to be trained end-to-end with the deep inpainting network.
The proposed method shows superior performance both qualitatively and quantitatively on three challenging image benchmarks.
arXiv Detail & Related papers (2020-09-28T12:09:08Z) - High-Resolution Image Inpainting with Iterative Confidence Feedback and
Guided Upsampling [122.06593036862611]
Existing image inpainting methods often produce artifacts when dealing with large holes in real applications.
We propose an iterative inpainting method with a feedback mechanism.
Experiments show that our method significantly outperforms existing methods in both quantitative and qualitative evaluations.
arXiv Detail & Related papers (2020-05-24T13:23:45Z) - Image Demoireing with Learnable Bandpass Filters [18.94907983950051]
We propose a novel multiscale bandpass convolutional neural network (MBCNN) to address this problem.
For texture restoration, we propose a learnable bandpass filter (LBF) to learn the frequency prior for moire texture removal.
For color restoration, we propose a two-step tone mapping strategy, which first applies a global tone mapping to correct for a global color shift, and then performs local fine tuning of the color per pixel.
arXiv Detail & Related papers (2020-04-01T12:57:26Z)
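As the forward-referenced companion to the "Deep Two-Stage High-Resolution Image Inpainting" entry above, here is one toy reading of "using information from neighboring pixels by shifting the original image in four directions": each pass pulls values from the four one-pixel shifts into hole pixels that border known pixels, filling the hole from its boundary inward. The function name and the iterative averaging are illustrative assumptions; the actual paper pairs such shifts with a deep inpainting model, which this sketch omits.

# Toy four-direction shift fill; not the paper's method, just an illustration of the idea.
import numpy as np

def four_direction_fill(image, mask, max_iters=1000):
    # image: 2-D grayscale array; mask: nonzero where pixels are missing (assumptions).
    img = image.astype(np.float64)
    hole = mask > 0
    shifts = ((1, 0), (-1, 0), (0, 1), (0, -1))    # down, up, right, left
    for _ in range(max_iters):
        if not hole.any():
            break
        acc = np.zeros_like(img)
        cnt = np.zeros_like(img)
        for dy, dx in shifts:
            # np.roll wraps at the borders; acceptable here for interior holes.
            shifted_img = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            shifted_known = np.roll(np.roll(~hole, dy, axis=0), dx, axis=1)
            take = hole & shifted_known            # hole pixels that see a known neighbor
            acc[take] += shifted_img[take]
            cnt[take] += 1.0
        ring = cnt > 0                             # the one-pixel boundary ring filled this pass
        img[ring] = acc[ring] / cnt[ring]
        hole &= ~ring
    return img.astype(image.dtype)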