Deep Edge-Aware Interactive Colorization against Color-Bleeding Effects
- URL: http://arxiv.org/abs/2107.01619v1
- Date: Sun, 4 Jul 2021 13:14:31 GMT
- Title: Deep Edge-Aware Interactive Colorization against Color-Bleeding Effects
- Authors: Eungyeup Kim, Sanghyeon Lee, Jeonghoon Park, Somi Choi, Choonghyun
Seo, Jaegul Choo
- Abstract summary: Deep image colorization networks often suffer from the color-bleeding artifact.
We propose a novel edge-enhancing framework for the regions of interest, by utilizing user scribbles that indicate where to enhance.
- Score: 15.386085970550996
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep image colorization networks often suffer from the color-bleeding
artifact, a problematic color spreading near the boundaries between adjacent
objects. These artifacts degrade the realism of generated outputs, limiting the
applicability of colorization models in practical applications. Although previous
approaches have tackled this problem automatically, they often produce imperfect
outputs because their enhancements apply only in limited cases, such as when the
input image has high grayscale contrast. Leveraging user interaction is instead a
promising approach, since it can guide edge correction in the desired regions. In
this paper, we propose a novel edge-enhancing framework for regions of interest,
utilizing user scribbles that indicate where to enhance. Our method requires
minimal user effort to obtain satisfactory enhancements. Experimental results on
various datasets demonstrate that our interactive approach outperforms existing
baselines in reducing color-bleeding artifacts.
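The abstract does not give implementation details; the sketch below is a minimal, purely illustrative example of how a user scribble could localize an edge-enhancement objective in a PyTorch colorization pipeline. The masked Sobel-gradient loss, the function names, and the hypothetical colorization_net call are assumptions, not the authors' actual method.

```python
import torch
import torch.nn.functional as F

def sobel_gradients(img):
    """Per-channel gradient magnitude via fixed Sobel kernels (img: B x C x H x W)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]], device=img.device)
    ky = kx.t()
    c = img.shape[1]
    kx = kx.reshape(1, 1, 3, 3).repeat(c, 1, 1, 1)
    ky = ky.reshape(1, 1, 3, 3).repeat(c, 1, 1, 1)
    gx = F.conv2d(img, kx, padding=1, groups=c)
    gy = F.conv2d(img, ky, padding=1, groups=c)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)

def scribble_edge_loss(pred_ab, target_ab, scribble_mask):
    """Hypothetical edge-enhancement term: penalize gradient mismatch between predicted
    and reference chrominance, but only inside the user-scribbled region.
    scribble_mask: B x 1 x H x W binary map (1 where the user asked for enhancement)."""
    diff = torch.abs(sobel_gradients(pred_ab) - sobel_gradients(target_ab)) * scribble_mask
    return diff.sum() / (scribble_mask.sum() * pred_ab.shape[1] + 1e-6)

# Usage sketch: add the masked edge term to a standard reconstruction loss.
# pred_ab = colorization_net(gray_input, user_hints)   # hypothetical network call
# loss = F.l1_loss(pred_ab, target_ab) + 0.1 * scribble_edge_loss(pred_ab, target_ab, mask)
```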
Related papers
- Learning Inclusion Matching for Animation Paint Bucket Colorization [76.4507878427755]
We introduce a new learning-based inclusion matching pipeline, which directs the network to comprehend the inclusion relationships between segments.
Our method features a two-stage pipeline that integrates a coarse color warping module with an inclusion matching module.
To facilitate the training of our network, we also develop a unique dataset, referred to as PaintBucket-Character.
arXiv Detail & Related papers (2024-03-27T08:32:48Z)
- Control Color: Multimodal Diffusion-based Interactive Image Colorization [81.68817300796644]
Control Color (Ctrl Color) is a multi-modal colorization method that leverages the pre-trained Stable Diffusion (SD) model.
We present an effective way to encode user strokes to enable precise local color manipulation.
We also introduce a novel module based on self-attention and a content-guided deformable autoencoder to address the long-standing issues of color overflow and inaccurate coloring.
arXiv Detail & Related papers (2024-02-16T17:51:13Z)
- Diffusing Colors: Image Colorization with Text Guided Diffusion [11.727899027933466]
We present a novel image colorization framework that utilizes image diffusion techniques with granular text prompts.
Our method provides a balance between automation and control, outperforming existing techniques in terms of visual quality and semantic coherence.
Our approach holds potential particularly for color enhancement and historical image colorization.
arXiv Detail & Related papers (2023-12-07T08:59:20Z)
- Easing Color Shifts in Score-Based Diffusion Models [0.0]
We quantify the performance of a nonlinear bypass connection in the score network.
We show that this network architecture substantially improves the resulting quality of the generated images.
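The summary does not describe the bypass architecture. Purely as an illustration, and assuming the bypass acts on the channel-wise spatial means of the input (an assumption, not the paper's stated design), such a connection could be sketched as:

```python
import torch
import torch.nn as nn

class MeanBypassScoreNet(nn.Module):
    """Illustrative only: a score network wrapped with a nonlinear bypass that acts on
    the spatial mean of the input (an assumed design, not the paper's architecture)."""
    def __init__(self, backbone: nn.Module, channels: int, hidden: int = 64):
        super().__init__()
        self.backbone = backbone                      # any (B, C, H, W) -> (B, C, H, W) score model
        self.bypass = nn.Sequential(                  # small nonlinear MLP on per-channel means
            nn.Linear(channels, hidden), nn.GELU(), nn.Linear(hidden, channels)
        )

    def forward(self, x):
        score = self.backbone(x)
        mean = x.mean(dim=(2, 3))                     # (B, C) spatial mean per channel
        correction = self.bypass(mean)                # nonlinear correction of the mean component
        return score + correction[:, :, None, None]   # broadcast back over H and W
```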
arXiv Detail & Related papers (2023-06-27T23:33:30Z)
- BiSTNet: Semantic Image Prior Guided Bidirectional Temporal Feature Fusion for Deep Exemplar-based Video Colorization [70.14893481468525]
We present BiSTNet, an effective framework that explores the colors of reference exemplars and utilizes them to aid video colorization.
We first establish the semantic correspondence between each frame and the reference exemplars in deep feature space to explore color information from reference exemplars.
We develop a mixed expert block to extract semantic information for modeling the object boundaries of frames so that the semantic image prior can better guide the colorization process.
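As a loose, hypothetical sketch of that correspondence step (not BiSTNet's actual implementation), deep-feature matching and exemplar color warping can be written as softmax-weighted attention between normalized frame and exemplar features; tensor shapes and the temperature value below are assumptions:

```python
import torch
import torch.nn.functional as F

def warp_exemplar_colors(frame_feat, exemplar_feat, exemplar_ab, temperature=0.01):
    """Hypothetical sketch: match frame and exemplar features (B, C, H, W) by cosine
    similarity and use the correspondence to warp the exemplar's ab colors (B, 2, H, W)
    onto the frame as coarse color guidance."""
    b, c, h, w = frame_feat.shape
    f = F.normalize(frame_feat.flatten(2), dim=1).transpose(1, 2)   # (B, HW, C)
    e = F.normalize(exemplar_feat.flatten(2), dim=1)                # (B, C, HW_ref)
    attn = (torch.bmm(f, e) / temperature).softmax(dim=-1)          # (B, HW, HW_ref) correspondence
    colors = exemplar_ab.flatten(2).transpose(1, 2)                 # (B, HW_ref, 2)
    warped = torch.bmm(attn, colors).transpose(1, 2)                # (B, 2, HW)
    return warped.reshape(b, 2, h, w)
```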
arXiv Detail & Related papers (2022-12-05T13:47:15Z)
- Guiding Users to Where to Give Color Hints for Efficient Interactive Sketch Colorization via Unsupervised Region Prioritization [31.750591990768307]
This paper proposes a novel model-guided deep interactive colorization framework that reduces the required amount of user interactions.
Our method, called GuidingPainter, prioritizes the regions where the model most needs a color hint, rather than relying solely on the user's manual decision of where to provide one.
arXiv Detail & Related papers (2022-10-25T18:50:09Z)
- PalGAN: Image Colorization with Palette Generative Adversarial Networks [51.59276436217957]
We propose PalGAN, a new GAN-based colorization approach that integrates palette estimation and chromatic attention.
PalGAN outperforms state-of-the-art methods in quantitative evaluation and visual comparison, delivering notably diverse, contrastive, and edge-preserving appearances.
arXiv Detail & Related papers (2022-10-20T12:28:31Z)
- iColoriT: Towards Propagating Local Hint to the Right Region in Interactive Colorization by Leveraging Vision Transformer [29.426206281291755]
We present iColoriT, a novel point-interactive colorization Vision Transformer capable of propagating user hints to relevant regions.
Our approach colorizes images in real-time by utilizing pixel shuffling, an efficient upsampling technique that replaces the decoder architecture.
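Pixel shuffling is a standard upsampling operation (torch.nn.PixelShuffle). The following minimal sketch shows a decoder-free prediction head in that spirit; the layer sizes are illustrative assumptions, not iColoriT's actual configuration:

```python
import torch
import torch.nn as nn

# Minimal sketch: project per-token features to C*r^2 values, then rearrange them into
# an r-times-larger spatial map with nn.PixelShuffle instead of a learned decoder.
class PixelShuffleHead(nn.Module):
    def __init__(self, in_dim=768, out_channels=2, upscale=16):
        super().__init__()
        self.proj = nn.Conv2d(in_dim, out_channels * upscale ** 2, kernel_size=1)
        self.shuffle = nn.PixelShuffle(upscale)        # (B, C*r^2, H, W) -> (B, C, H*r, W*r)

    def forward(self, token_map):                      # (B, in_dim, H/16, W/16) feature map
        return self.shuffle(self.proj(token_map))      # (B, 2, H, W) predicted ab channels

head = PixelShuffleHead()
feat = torch.randn(1, 768, 14, 14)                     # e.g. ViT tokens reshaped to a 14x14 grid
print(head(feat).shape)                                # torch.Size([1, 2, 224, 224])
```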
arXiv Detail & Related papers (2022-07-14T11:40:32Z)
- Detecting Recolored Image by Spatial Correlation [60.08643417333974]
Image recoloring is an emerging editing technique that can manipulate the color values of an image to give it a new style.
In this paper, we explore a solution from the perspective of spatial correlation, which exhibits generic detection capability for both conventional and deep-learning-based recoloring.
Our method achieves state-of-the-art detection accuracy on multiple benchmark datasets and generalizes well to unknown types of recoloring methods.
arXiv Detail & Related papers (2022-04-23T01:54:06Z)
- Towards Vivid and Diverse Image Colorization with Generative Color Prior [17.087464490162073]
Recent deep-learning-based methods can automatically colorize images at low cost.
We aim to recover vivid colors by leveraging the rich and diverse color priors encapsulated in a pretrained Generative Adversarial Network (GAN).
Thanks to the powerful generative color prior and delicate designs, our method produces vivid colors in a single forward pass.
arXiv Detail & Related papers (2021-08-19T17:49:21Z)
- Image Colorization: A Survey and Dataset [94.59768013860668]
This article presents a comprehensive survey of state-of-the-art deep learning-based image colorization techniques.
It categorizes the existing colorization techniques into seven classes and discusses important factors governing their performance.
We perform an extensive experimental evaluation of existing image colorization methods using both existing datasets and our proposed one.
arXiv Detail & Related papers (2020-08-25T01:22:52Z)