Automatic Facial Skin Feature Detection for Everyone
- URL: http://arxiv.org/abs/2203.16056v1
- Date: Wed, 30 Mar 2022 04:52:54 GMT
- Title: Automatic Facial Skin Feature Detection for Everyone
- Authors: Qian Zheng, Ankur Purwar, Heng Zhao, Guang Liang Lim, Ling Li,
Debasish Behera, Qian Wang, Min Tan, Rizhao Cai, Jennifer Werner, Dennis Sng,
Maurice van Steensel, Weisi Lin, Alex C Kot
- Abstract summary: We present an automatic facial skin feature detection method that works across a variety of skin tones and age groups for selfies in the wild.
To be specific, we annotate the locations of acne, pigmentation, and wrinkles in selfie images with different skin tone colors, severity levels, and lighting conditions.
- Score: 60.31670960526022
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatic assessment and understanding of facial skin condition have several
applications, including the early detection of underlying health problems,
lifestyle and dietary treatment, skin-care product recommendation, etc. Selfies
in the wild serve as an excellent data resource to democratize skin quality
assessment, but suffer from several data collection challenges. The key to
guaranteeing an accurate assessment is accurate detection of different skin
features. We present an automatic facial skin feature detection method that
works across a variety of skin tones and age groups for selfies in the wild. To
be specific, we annotate the locations of acne, pigmentation, and wrinkles in
selfie images with different skin tone colors, severity levels, and lighting
conditions. Annotation follows a two-phase scheme in which a dermatologist
trains volunteers to annotate these features. We employ UNet++ as the network
architecture for feature detection. This work shows that a detector trained
under the two-phase annotation scheme can robustly localize acne, pigmentation,
and wrinkles in selfie images across different ethnicities, skin tone colors,
severity levels, age groups, and lighting conditions.
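The abstract names UNet++ as the detection network but gives no implementation details. The sketch below is a minimal, hypothetical setup using the third-party segmentation_models_pytorch package with one output channel per skin feature; the encoder, loss, learning rate, and crop size are illustrative assumptions, not the authors' configuration.
```python
# Minimal sketch (not the authors' code): UNet++ for facial skin feature
# segmentation, assuming the segmentation_models_pytorch package is available.
import torch
import segmentation_models_pytorch as smp

FEATURES = ["acne", "pigmentation", "wrinkle"]  # one mask channel per feature

model = smp.UnetPlusPlus(
    encoder_name="resnet34",   # assumed encoder; the paper does not specify one
    encoder_weights=None,      # set to "imagenet" for pretrained weights
    in_channels=3,             # RGB selfie crops
    classes=len(FEATURES),     # independent per-pixel logits per feature
)

# Per-pixel binary cross-entropy lets the three feature masks overlap,
# which matches multi-label location annotations better than a softmax.
criterion = torch.nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, masks: torch.Tensor) -> float:
    """images: (B, 3, H, W) floats in [0, 1]; masks: (B, 3, H, W) binary targets."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)           # (B, 3, H, W)
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch of 256x256 crops just to show the expected tensor shapes.
images = torch.rand(2, 3, 256, 256)
masks = (torch.rand(2, 3, 256, 256) > 0.95).float()
print(f"loss = {train_step(images, masks):.4f}")
```
Treating acne, pigmentation, and wrinkles as three independent binary masks rather than mutually exclusive classes is one plausible reading of the multi-feature annotation; the paper does not state which formulation it uses.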
Related papers
- FairSkin: Fair Diffusion for Skin Disease Image Generation [54.29840149709033]
Diffusion Model (DM) has become a leading method in generating synthetic medical images, but it suffers from a critical twofold bias.
We propose FairSkin, a novel DM framework that mitigates these biases through a three-level resampling mechanism.
Our approach significantly improves the diversity and quality of generated images, contributing to more equitable skin disease detection in clinical settings.
arXiv Detail & Related papers (2024-10-29T21:37:03Z)
- Equitable Skin Disease Prediction Using Transfer Learning and Domain Adaptation [1.9505972437091028]
Existing artificial intelligence (AI) models in dermatology face challenges in accurately diagnosing diseases across diverse skin tones.
We employ a transfer-learning approach that capitalizes on the rich, transferable knowledge from various image domains.
Among all methods, Med-ViT emerged as the top performer due to its comprehensive feature representation learned from diverse image sources.
arXiv Detail & Related papers (2024-09-01T23:48:26Z)
- Optimizing Skin Lesion Classification via Multimodal Data and Auxiliary Task Integration [54.76511683427566]
This research introduces a novel multimodal method for classifying skin lesions, integrating smartphone-captured images with essential clinical and demographic information.
A distinctive aspect of this method is the integration of an auxiliary task focused on super-resolution image prediction.
The experimental evaluations have been conducted using the PAD-UFES20 dataset, applying various deep-learning architectures.
arXiv Detail & Related papers (2024-02-16T05:16:20Z)
- DDI-CoCo: A Dataset For Understanding The Effect Of Color Contrast In Machine-Assisted Skin Disease Detection [51.92255321684027]
We study the interaction between skin tone and color difference effects and suggest that color difference can be an additional reason behind model performance bias between skin tones.
Our work provides a complementary angle to dermatology AI for improving skin disease detection.
arXiv Detail & Related papers (2024-01-24T07:45:24Z)
- FaceSkin: A Privacy Preserving Facial skin patch Dataset for multi Attributes classification [0.9282594860064426]
We introduce a dataset called FaceSkin, which encompasses a diverse range of ages and races.
We incorporate synthetic skin-patches obtained from 2D and 3D attack images, including printed paper, replays, and 3D masks.
We evaluate the FaceSkin dataset across distinct categories and present experimental results demonstrating its effectiveness in attribute classification.
arXiv Detail & Related papers (2023-08-09T07:53:33Z)
- CIAO! A Contrastive Adaptation Mechanism for Non-Universal Facial Expression Recognition [80.07590100872548]
We propose Contrastive Inhibitory Adaptati On (CIAO), a mechanism that adapts the last layer of facial encoders to depict specific affective characteristics on different datasets.
CIAO presents an improvement in facial expression recognition performance over six different datasets with very unique affective representations.
arXiv Detail & Related papers (2022-08-10T15:46:05Z)
- Towards Intrinsic Common Discriminative Features Learning for Face Forgery Detection using Adversarial Learning [59.548960057358435]
We propose a novel method which utilizes adversarial learning to eliminate the negative effect of different forgery methods and facial identities.
Our face forgery detection model learns to extract common discriminative features through eliminating the effect of forgery methods and facial identities.
arXiv Detail & Related papers (2022-07-08T09:23:59Z)
- Analysis of Manual and Automated Skin Tone Assignments for Face Recognition Applications [8.334167427229572]
We analyze a set of manual Fitzpatrick skin type assignments and also employ the individual typology angle (ITA) to automatically estimate the skin tone from face images (see the ITA sketch after this list).
The level of agreement between automated and manual approaches is found to be 96% or better for the MORPH images.
arXiv Detail & Related papers (2021-04-29T22:35:47Z)
- Evaluating Deep Neural Networks Trained on Clinical Images in Dermatology with the Fitzpatrick 17k Dataset [0.23746609573239755]
This dataset includes 16,577 clinical images sourced from two dermatology atlases with Fitzpatrick skin type labels.
We train a deep neural network model to classify 114 skin conditions and find that the model is most accurate on skin types similar to those it was trained on.
arXiv Detail & Related papers (2021-04-20T13:37:30Z)
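The "Analysis of Manual and Automated Skin Tone Assignments" entry above relies on the individual typology angle (ITA), a standard CIELAB-based skin tone measure. The sketch below shows the usual formula, ITA = arctan((L* - 50) / b*) in degrees; the averaging over a pre-segmented skin patch and the example values are illustrative assumptions, not details from that paper.
```python
# Minimal sketch: individual typology angle (ITA) from an sRGB skin patch,
# ITA = arctan((L* - 50) / b*) in degrees, computed in CIELAB space.
import math
import numpy as np
from skimage import color  # scikit-image provides the sRGB -> CIELAB conversion

def ita_degrees(skin_patch_rgb: np.ndarray) -> float:
    """skin_patch_rgb: (H, W, 3) sRGB values in [0, 1], containing skin pixels only."""
    lab = color.rgb2lab(skin_patch_rgb)        # channels: L*, a*, b*
    l_star = float(np.mean(lab[..., 0]))
    b_star = float(np.mean(lab[..., 2]))
    # atan2 avoids division by zero when b* is close to 0.
    return math.degrees(math.atan2(l_star - 50.0, b_star))

# Synthetic light-toned patch; larger ITA values correspond to lighter skin tones.
patch = np.ones((32, 32, 3)) * np.array([0.90, 0.75, 0.65])
print(f"ITA = {ita_degrees(patch):.1f} degrees")
```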
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the accuracy of this information and is not responsible for any consequences arising from its use.