Perceptual Artifacts Localization for Inpainting
- URL: http://arxiv.org/abs/2208.03357v1
- Date: Fri, 5 Aug 2022 18:50:51 GMT
- Title: Perceptual Artifacts Localization for Inpainting
- Authors: Lingzhi Zhang, Yuqian Zhou, Connelly Barnes, Sohrab Amirghodsi, Zhe
Lin, Eli Shechtman, Jianbo Shi
- Abstract summary: We propose a new learning task of automatic segmentation of inpainting perceptual artifacts.
We train advanced segmentation networks on a dataset to reliably localize inpainting artifacts within inpainted images.
We also propose a new evaluation metric called Perceptual Artifact Ratio (PAR), which is the ratio of objectionable inpainted regions to the entire inpainted area.
- Score: 60.5659086595901
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image inpainting is an essential task for multiple practical applications
like object removal and image editing. Deep GAN-based models greatly improve
the inpainting performance in structures and textures within the hole, but
might also generate unexpected artifacts like broken structures or color blobs.
Users judge the effectiveness of inpainting models by perceiving these
artifacts, and in a typical retouching workflow they retouch these imperfect
areas and inpaint them again. Inspired by this workflow, we propose a new learning task of
automatic segmentation of inpainting perceptual artifacts, and apply the model
for inpainting model evaluation and iterative refinement. Specifically, we
first construct a new inpainting artifacts dataset by manually annotating
perceptual artifacts in the results of state-of-the-art inpainting models. Then
we train advanced segmentation networks on this dataset to reliably localize
inpainting artifacts within inpainted images. Next, we propose a new
interpretable evaluation metric called the Perceptual Artifact Ratio (PAR),
which is the ratio of objectionable inpainted regions to the entire inpainted area.
PAR demonstrates a strong correlation with real user preference. Finally, we
further apply the generated masks for iterative image inpainting by combining
our approach with multiple recent inpainting methods. Extensive experiments
demonstrate a consistent decrease in artifact regions and a corresponding
improvement in inpainting quality across the different methods.
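As a concrete illustration of the metric, PAR can be computed from two binary masks: the original inpainting hole mask and a predicted artifact mask. The sketch below is a minimal NumPy implementation under that assumption; the function name and mask conventions are illustrative, not taken from the paper's code.

```python
import numpy as np

def perceptual_artifact_ratio(artifact_mask, hole_mask):
    """Perceptual Artifact Ratio (PAR): the fraction of the inpainted
    (hole) area that the segmentation model flags as objectionable.

    Both masks are boolean arrays of the same shape; artifact pixels
    are only counted where they fall inside the hole.
    """
    artifact_mask = np.asarray(artifact_mask, dtype=bool)
    hole_mask = np.asarray(hole_mask, dtype=bool)
    hole_area = hole_mask.sum()
    if hole_area == 0:
        return 0.0  # nothing was inpainted
    artifact_area = np.logical_and(artifact_mask, hole_mask).sum()
    return float(artifact_area) / float(hole_area)

# Toy example: a 4x4 hole in which a 2x2 patch is flagged as artifacts.
hole = np.zeros((8, 8), dtype=bool)
hole[2:6, 2:6] = True          # 16 inpainted pixels
artifacts = np.zeros((8, 8), dtype=bool)
artifacts[2:4, 2:4] = True     # 4 objectionable pixels
print(perceptual_artifact_ratio(artifacts, hole))  # 0.25
```

A lower PAR means fewer flagged pixels per inpainted pixel, which is why the metric correlates with user preference and can drive the iterative refinement loop (re-inpaint only the flagged region).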
Related papers
- Dense Feature Interaction Network for Image Inpainting Localization (2024-08-05) [28.028361409524457]
Inpainting can be used to conceal or alter image contents in malicious manipulation of images.
Existing methods mostly rely on a basic encoder-decoder structure, which often results in a high number of false positives.
In this paper, we describe a new method for inpainting detection based on a Dense Feature Interaction Network (DeFI-Net).
- RefFusion: Reference Adapted Diffusion Models for 3D Scene Inpainting (2024-04-16) [63.567363455092234]
RefFusion is a novel 3D inpainting method based on a multi-scale personalization of an image inpainting diffusion model to the given reference view.
Our framework achieves state-of-the-art results for object removal while maintaining high controllability.
- Fill in the ____ (a Diffusion-based Image Inpainting Pipeline) (2024-03-24) [0.0]
Inpainting is the process of taking an image and generating lost or intentionally occluded portions.
Modern inpainting techniques have shown remarkable ability in generating sensible completions.
This work addresses a critical gap in existing models: the ability to prompt and control what exactly is generated.
- Stroke-based Neural Painting and Stylization with Dynamically Predicted Painting Region (2023-09-07) [66.75826549444909]
Stroke-based rendering aims to recreate an image with a set of strokes.
We propose Compositional Neural Painter, which predicts the painting region based on the current canvas.
We extend our method to stroke-based style transfer with a novel differentiable distance transform loss.
- Line Drawing Guided Progressive Inpainting of Mural Damage (2022-11-12) [18.768636785377645]
We propose a line drawing guided progressive mural inpainting method.
It divides the inpainting process into two steps: structure reconstruction and color correction.
The proposed approach is evaluated against the current state-of-the-art image inpainting methods.
- Learning Prior Feature and Attention Enhanced Image Inpainting (2022-08-03) [63.21231753407192]
This paper incorporates the pre-training based Masked AutoEncoder (MAE) into the inpainting model.
We propose to use attention priors from MAE to make the inpainting model learn more long-distance dependencies between masked and unmasked regions.
- Improve Deep Image Inpainting by Emphasizing the Complexity of Missing Regions (2022-02-13) [20.245637164975594]
In this paper, we enhance deep image inpainting models with the help of classical image complexity metrics.
A knowledge-assisted index composed of missingness complexity and forward loss is presented to guide batch selection in the training procedure.
We experimentally demonstrate the improvements for several recently developed image inpainting models on various datasets.
- Restore from Restored: Single-image Inpainting (2021-10-25) [9.699531255678856]
We present a novel and efficient self-supervised fine-tuning algorithm for inpainting networks.
We update the parameters of pre-trained inpainting networks by utilizing existing self-similar patches.
We achieve state-of-the-art inpainting results on publicly available benchmark datasets.
- In&Out: Diverse Image Outpainting via GAN Inversion (2021-04-01) [89.84841983778672]
Image outpainting seeks a semantically consistent extension of the input image beyond its available content.
In this work, we formulate the problem from the perspective of inverting generative adversarial networks.
Our generator renders micro-patches conditioned on their joint latent code as well as their individual positions in the image.
- High-Resolution Image Inpainting with Iterative Confidence Feedback and Guided Upsampling (2020-05-24) [122.06593036862611]
Existing image inpainting methods often produce artifacts when dealing with large holes in real applications.
We propose an iterative inpainting method with a feedback mechanism.
Experiments show that our method significantly outperforms existing methods in both quantitative and qualitative evaluations.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.