EleGANt: Exquisite and Locally Editable GAN for Makeup Transfer
- URL: http://arxiv.org/abs/2207.09840v1
- Date: Wed, 20 Jul 2022 11:52:07 GMT
- Title: EleGANt: Exquisite and Locally Editable GAN for Makeup Transfer
- Authors: Chenyu Yang, Wanrong He, Yingqing Xu, Yang Gao
- Abstract summary: We propose Exquisite and locally editable GAN for makeup transfer (EleGANt)
It encodes facial attributes into pyramidal feature maps to preserve high-frequency information.
EleGANt is the first to achieve customized local editing within arbitrary areas by corresponding editing on the feature maps.
- Score: 13.304362849679391
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most existing methods view makeup transfer as transferring color
distributions of different facial regions and ignore details such as eye
shadows and blushes. Besides, they only achieve controllable transfer within
predefined fixed regions. This paper emphasizes the transfer of makeup details
and steps towards more flexible controls. To this end, we propose Exquisite and
locally editable GAN for makeup transfer (EleGANt). It encodes facial
attributes into pyramidal feature maps to preserve high-frequency information.
It uses attention to extract makeup features from the reference and adapt them
to the source face, and we introduce a novel Sow-Attention Module that applies
attention within shifted overlapped windows to reduce the computational cost.
Moreover, EleGANt is the first to achieve customized local editing within
arbitrary areas by corresponding editing on the feature maps. Extensive
experiments demonstrate that EleGANt generates realistic makeup faces with
exquisite details and achieves state-of-the-art performance. The code is
available at https://github.com/Chenyu-Yang-2000/EleGANt.
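The abstract only names the Sow-Attention idea (attention within shifted, overlapped windows to cut the quadratic cost). A minimal 1D sketch of shifted-window attention, simplified from the paper's 2D overlapped formulation and not taken from the released code (all names here are illustrative), might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def shifted_window_attention(q, k, v, window=4, shift=0):
    """Self-attention restricted to local windows of size `window`.

    Shifting the window grid by `shift` lets positions near a window
    border attend across that border, so stacked shifted layers
    approximate global attention at O(n * window) cost instead of
    O(n^2). Assumes n is divisible by `window`.
    """
    n, d = q.shape
    # Roll the sequence so the window grid starts at `shift`.
    qs, ks, vs = (np.roll(t, -shift, axis=0) for t in (q, k, v))
    out = np.zeros_like(vs)
    for start in range(0, n, window):
        sl = slice(start, start + window)
        scores = qs[sl] @ ks[sl].T / np.sqrt(d)
        out[sl] = softmax(scores) @ vs[sl]
    return np.roll(out, shift, axis=0)  # undo the shift

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
out0 = shifted_window_attention(x, x, x, window=4, shift=0)
out1 = shifted_window_attention(x, x, x, window=4, shift=2)
```

The actual Sow-Attention Module operates on 2D feature maps and averages results over overlapping windows; this sketch only shows why restricting attention to windows reduces the cost and why shifting restores cross-window information flow.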
Related papers
- Stable-Makeup: When Real-World Makeup Transfer Meets Diffusion Model [35.01727715493926]
Current makeup transfer methods are limited to simple makeup styles, making them difficult to apply in real-world scenarios.
We introduce Stable-Makeup, a novel diffusion-based makeup transfer method capable of robustly transferring a wide range of real-world makeup.
arXiv Detail & Related papers (2024-03-12T15:53:14Z)
- SARA: Controllable Makeup Transfer with Spatial Alignment and Region-Adaptive Normalization [67.90315365909244]
We propose a novel Spatial Alignment and Region-Adaptive normalization method (SARA) in this paper.
Our method generates detailed makeup transfer results that can handle large spatial misalignments and achieve part-specific and shade-controllable makeup transfer.
Experimental results show that our SARA method outperforms existing methods and achieves state-of-the-art performance on two public datasets.
arXiv Detail & Related papers (2023-11-28T14:46:51Z)
- LC-NeRF: Local Controllable Face Generation in Neural Radiance Field [55.54131820411912]
LC-NeRF is composed of a Local Region Generators Module and a Spatial-Aware Fusion Module.
Our method provides better local editing than state-of-the-art face editing methods.
Our method also performs well in downstream tasks, such as text-driven facial image editing.
arXiv Detail & Related papers (2023-02-19T05:50:08Z)
- BeautyREC: Robust, Efficient, and Content-preserving Makeup Transfer [73.39598356799974]
We propose a Robust, Efficient, and Component-specific makeup transfer method (abbreviated as BeautyREC).
It uses component-specific correspondence to directly transfer the makeup style of a reference image to the corresponding components of the source face.
As an auxiliary, the long-range visual dependencies captured by a Transformer are introduced for effective global makeup transfer.
arXiv Detail & Related papers (2022-12-12T12:38:27Z)
- FEAT: Face Editing with Attention [70.89233432407305]
We build on the StyleGAN generator and present a method that explicitly encourages face manipulation to focus on the intended regions.
During the generation of the edited image, the attention map serves as a mask that guides a blending between the original features and the modified ones.
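The attention-as-mask blending described here reduces to a per-pixel convex combination of original and edited features. A hedged sketch (names and shapes are illustrative, not from the FEAT code) of that blending step:

```python
import numpy as np

def attention_blend(original, edited, attn):
    """Blend edited features into the originals under an attention mask.

    `attn` has values in [0, 1] and broadcasts over the channel axis:
    the edit takes effect only where the mask is high, leaving the
    rest of the face untouched.
    """
    return attn * edited + (1.0 - attn) * original

orig = np.zeros((4, 4, 8))   # original feature map (H, W, C)
edit = np.ones((4, 4, 8))    # feature map after the StyleGAN-space edit
mask = np.zeros((4, 4, 1))
mask[1:3, 1:3] = 1.0         # attend to (and so edit) only the center
out = attention_blend(orig, edit, mask)
```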
arXiv Detail & Related papers (2022-02-06T06:07:34Z)
- PSGAN++: Robust Detail-Preserving Makeup Transfer and Removal [176.47249346856393]
PSGAN++ is capable of performing both detail-preserving makeup transfer and effective makeup removal.
For makeup transfer, PSGAN++ uses a Makeup Distill Network to extract makeup information.
For makeup removal, PSGAN++ applies an Identity Distill Network to embed the identity information of with-makeup images into identity matrices.
arXiv Detail & Related papers (2021-05-26T04:37:57Z)
- Facial Attribute Transformers for Precise and Robust Makeup Transfer [79.41060385695977]
We propose a novel Facial Attribute Transformer (FAT) and its variant Spatial FAT for high-quality makeup transfer.
FAT is able to model the semantic correspondences and interactions between the source face and reference face, and then precisely estimate and transfer the facial attributes.
We also integrate thin plate splines (TPS) into FAT, thus creating Spatial FAT, which is the first method that can transfer geometric attributes in addition to color and texture.
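The thin-plate-spline integration mentioned here is the standard 2D TPS fit: solve a small bordered linear system for the spline weights, then map query points. A self-contained sketch of that textbook formulation (not the Spatial FAT code; names are illustrative):

```python
import numpy as np

def tps_warp(src_pts, dst_pts, query):
    """Fit a 2D thin-plate spline mapping src_pts -> dst_pts, apply to query.

    Uses the classic TPS kernel U(r) = r^2 log r and the bordered system
    [[K, P], [P^T, 0]] [w; a] = [dst; 0], where P = [1, x, y].
    """
    def U(r2):  # U written in terms of squared distance: 0.5 * r2 * log(r2)
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(r2 > 0, 0.5 * r2 * np.log(r2), 0.0)

    n = len(src_pts)
    d2 = ((src_pts[:, None] - src_pts[None]) ** 2).sum(-1)
    K = U(d2)
    P = np.hstack([np.ones((n, 1)), src_pts])
    L = np.zeros((n + 3, n + 3))
    L[:n, :n], L[:n, n:], L[n:, :n] = K, P, P.T
    params = np.linalg.solve(L, np.vstack([dst_pts, np.zeros((3, 2))]))
    q2 = ((query[:, None] - src_pts[None]) ** 2).sum(-1)
    return U(q2) @ params[:n] + np.hstack(
        [np.ones((len(query), 1)), query]) @ params[n:]

# Identity fit: mapping the corners to themselves warps nothing.
corners = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
query = np.array([[0.5, 0.5], [0.2, 0.8]])
warped = tps_warp(corners, corners, query)
```

In a makeup-transfer setting the control points would be facial landmarks of the source and reference faces, so the spline carries geometric attributes (e.g. eyebrow shape) rather than only color and texture.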
arXiv Detail & Related papers (2021-04-07T03:39:02Z)
- Lipstick ain't enough: Beyond Color Matching for In-the-Wild Makeup Transfer [20.782984081934213]
We propose a holistic makeup transfer framework that can handle all the mentioned makeup components.
It consists of an improved color transfer branch and a novel pattern transfer branch to learn all makeup properties.
Our framework achieves state-of-the-art performance on both light and extreme makeup styles.
arXiv Detail & Related papers (2021-04-05T12:12:56Z)