On selection of centroids of fuzzy clusters for color classification
- URL: http://arxiv.org/abs/2407.17423v1
- Date: Tue, 9 Jul 2024 11:59:31 GMT
- Title: On selection of centroids of fuzzy clusters for color classification
- Authors: Dae-Won Kim, Kwang H. Lee
- Abstract summary: A novel initialization method for the fuzzy c-means (FCM) algorithm is proposed for the color clustering problem.
The proposed initialization extracts dominant colors, the most vivid and distinguishable colors in the data.
To obtain the dominant colors and their closest color points, we introduce reference colors and define a fuzzy membership model.
- Score: 2.002741592555996
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: A novel initialization method in the fuzzy c-means (FCM) algorithm is proposed for the color clustering problem. Given a set of color points, the proposed initialization extracts dominant colors that are the most vivid and distinguishable colors. Color points closest to the dominant colors are selected as initial centroids in the FCM. To obtain the dominant colors and their closest color points, we introduce reference colors and define a fuzzy membership model between a color point and a reference color.
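As a rough illustration of the idea described in the abstract, the sketch below seeds a standard fuzzy c-means run with centroids chosen through reference colors: every color point gets a fuzzy membership to each reference color, the reference colors with the strongest overall membership are treated as the dominant colors, and the point that belongs most strongly to each of them becomes an initial centroid. The reference palette, the Gaussian membership function, and the `sigma` parameter are illustrative assumptions, not the membership model defined in the paper.

```python
# Minimal sketch of the initialization idea, not the paper's exact formulation.
# Assumptions (not from the paper): a hand-picked RGB reference palette and a
# Gaussian-of-Euclidean-distance fuzzy membership between point and reference.
import numpy as np

REFERENCE_COLORS = np.array([            # assumed reference palette (RGB)
    [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 0],
    [255, 0, 255], [0, 255, 255], [255, 128, 0], [128, 0, 255],
], dtype=float)

def reference_membership(points, refs, sigma=60.0):
    """Fuzzy membership of each color point to each reference color (Gaussian kernel)."""
    d = np.linalg.norm(points[:, None, :] - refs[None, :, :], axis=2)
    return np.exp(-(d ** 2) / (2.0 * sigma ** 2))          # shape (n_points, n_refs)

def dominant_color_init(points, k, refs=REFERENCE_COLORS):
    """Pick k initial centroids: for the k reference colors with the largest total
    membership, take the color point that belongs most strongly to each (k <= len(refs))."""
    u = reference_membership(points, refs)
    top_refs = np.argsort(-u.sum(axis=0))[:k]               # k most "dominant" references
    return points[u[:, top_refs].argmax(axis=0)].astype(float)

def fcm(points, k, m=2.0, iters=100, tol=1e-4):
    """Standard fuzzy c-means, seeded with the dominant-color initialization above."""
    centroids = dominant_color_init(points, k)
    for _ in range(iters):
        dist = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        # membership update: u_ij = 1 / sum_l (d_ij / d_il)^(2/(m-1))
        u = 1.0 / ((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
        um = u ** m
        # centroid update: membership-weighted mean of the color points
        new_centroids = (um.T @ points) / um.sum(axis=0)[:, None]
        if np.linalg.norm(new_centroids - centroids) < tol:
            return new_centroids, u
        centroids = new_centroids
    return centroids, u

# usage: cluster random RGB pixels into 5 fuzzy color classes
pixels = np.random.randint(0, 256, (1000, 3)).astype(float)
centers, memberships = fcm(pixels, k=5)
```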
Related papers
- Paint Bucket Colorization Using Anime Character Color Design Sheets [72.66788521378864]
We introduce inclusion matching, which allows the network to understand the relationships between segments.
Our network's training pipeline significantly improves performance in both colorization and consecutive frame colorization.
To support our network's training, we have developed a unique dataset named PaintBucket-Character.
arXiv Detail & Related papers (2024-10-25T09:33:27Z)
- Fuzzy color model and clustering algorithm for color clustering problem [2.002741592555996]
We model the inherent uncertainty and vagueness of color data using a fuzzy color model.
With the fuzzy color model, we developed a new fuzzy clustering algorithm for an efficient partition of color data.
arXiv Detail & Related papers (2024-07-09T11:53:54Z)
- Palette-based Color Transfer between Images [9.471264982229508]
We propose a new palette-based color transfer method that can automatically generate a new color scheme.
With a redesigned palette-based clustering method, pixels can be classified into different segments according to color distribution.
Our method exhibits significant advantages over peer methods in terms of natural realism, color consistency, generality, and robustness.
arXiv Detail & Related papers (2024-05-14T01:41:19Z)
- Automatic Controllable Colorization via Imagination [55.489416987587305]
We propose a framework for automatic colorization that allows for iterative editing and modifications.
By understanding the content within a grayscale image, we utilize a pre-trained image generation model to generate multiple images that contain the same content.
These images serve as references for coloring, mimicking the process of human experts.
arXiv Detail & Related papers (2024-04-08T16:46:07Z)
- Control Color: Multimodal Diffusion-based Interactive Image Colorization [81.68817300796644]
Control Color (Ctrl Color) is a multi-modal colorization method that leverages the pre-trained Stable Diffusion (SD) model.
We present an effective way to encode user strokes to enable precise local color manipulation.
We also introduce a novel module based on self-attention and a content-guided deformable autoencoder to address the long-standing issues of color overflow and inaccurate coloring.
arXiv Detail & Related papers (2024-02-16T17:51:13Z)
- DARC: Distribution-Aware Re-Coloring Model for Generalizable Nucleus Segmentation [68.43628183890007]
We argue that domain gaps can also be caused by different foreground (nucleus)-background ratios.
First, we introduce a re-coloring method that relieves dramatic image color variations between different domains.
Second, we propose a new instance normalization method that is robust to the variation in the foreground-background ratios.
arXiv Detail & Related papers (2023-09-01T01:01:13Z)
- BiSTNet: Semantic Image Prior Guided Bidirectional Temporal Feature Fusion for Deep Exemplar-based Video Colorization [70.14893481468525]
We present an effective BiSTNet to explore colors of reference exemplars and utilize them to help video colorization.
We first establish the semantic correspondence between each frame and the reference exemplars in deep feature space to explore color information from reference exemplars.
We develop a mixed expert block to extract semantic information for modeling the object boundaries of frames so that the semantic image prior can better guide the colorization process.
arXiv Detail & Related papers (2022-12-05T13:47:15Z)
- ABANICCO: A New Color Space for Multi-Label Pixel Classification and Color Segmentation [1.7205106391379026]
We propose a novel method combining geometric analysis of color theory, fuzzy color spaces, and multi-label systems for the automatic classification of pixels according to 12 standard color categories.
We present a robust, unsupervised, unbiased strategy for color naming based on statistics and color theory.
arXiv Detail & Related papers (2022-11-15T19:26:51Z)
- PalGAN: Image Colorization with Palette Generative Adversarial Networks [51.59276436217957]
We propose PalGAN, a new GAN-based colorization approach integrated with palette estimation and chromatic attention.
PalGAN outperforms state-of-the-art methods in quantitative evaluation and visual comparison, delivering notably diverse, contrastive, and edge-preserving appearances.
arXiv Detail & Related papers (2022-10-20T12:28:31Z)
- Perceptual Robust Hashing for Color Images with Canonical Correlation Analysis [21.22196411212803]
We propose a novel perceptual image hashing scheme for color images based on a ring-ribbon quadtree and color vector angle.
Our scheme performs satisfactorily with respect to robustness, discrimination, and security, and can be effectively used in copy detection and content authentication.
arXiv Detail & Related papers (2020-12-08T09:35:21Z)
- Reference-Based Video Colorization with Spatiotemporal Correspondence [8.472559058510205]
We propose a reference-based video colorization framework with temporal correspondence.
By restricting temporally-related regions for referencing colors, our approach propagates faithful colors throughout the video.
arXiv Detail & Related papers (2020-11-25T05:47:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.