Skin Tone Classification in Digital Images Using CNN For Make-Up and Color Recommendation
DOI: https://doi.org/10.64878/jistics.v1i3.29
Keywords: CNN, EfficientNetB0, Image Classification, Recommendation, Skin Tone, Deep Learning
Abstract
Human skin tone variation is an obstacle in the development of digital beauty product recommendation systems. The purpose of this study is to categorize skin tone into three groups (Black, Brown, and White) using a Convolutional Neural Network (CNN) based on a fine-tuned EfficientNetB0 architecture, trained on a balanced dataset of 1,500 facial images with 500 images per class. All images were resized to 224 × 224 pixels to match the EfficientNetB0 input size and ensure data uniformity. The dataset was obtained from the Kaggle platform, processed through normalization and augmentation stages, and evaluated using 5-fold cross-validation. The model achieved an overall accuracy of 88.67%, with the White category showing a precision of 0.93, a recall of 0.95, an F1-score of 0.94, and the highest AUC of 0.99, indicating very satisfactory performance. In addition, the system offers personalized beauty product recommendations, including foundation shades, lipstick colors, and clothing color palettes, tailored to the detected skin tone. The approach improves the user experience by providing accurate recommendations that adapt to varied lighting conditions, making it suitable for digital beauty platforms.
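As a rough illustration of the pipeline described in the abstract, the sketch below shows a fine-tuned EfficientNetB0 classifier with resizing, augmentation, and a simple tone-to-product mapping, assuming a TensorFlow/Keras environment and a dataset folder with one subdirectory per class. Names such as DATA_DIR, RECOMMENDATIONS, and recommend() are hypothetical, and the 5-fold cross-validation loop and exact recommendation rules from the paper are omitted.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)            # input size expected by EfficientNetB0
CLASS_NAMES = ["black", "brown", "white"]
DATA_DIR = "skin_tone_dataset"   # hypothetical path to the Kaggle images

# Load and resize images; labels are inferred from the subfolder names.
train_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, image_size=IMG_SIZE, batch_size=32,
    class_names=CLASS_NAMES, label_mode="categorical")

# Light augmentation applied during training.
augment = models.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

# Fine-tuned EfficientNetB0: ImageNet-pretrained backbone plus a new
# three-class softmax head.
base = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
base.trainable = True  # fine-tune rather than freeze the backbone

inputs = layers.Input(shape=(224, 224, 3))
x = augment(inputs)
x = tf.keras.applications.efficientnet.preprocess_input(x)
x = base(x)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(len(CLASS_NAMES), activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)

# Illustrative mapping from predicted skin tone to product suggestions;
# the actual recommendation rules are not specified in the abstract.
RECOMMENDATIONS = {
    "black": {"foundation": "deep",  "lipstick": "berry",     "palette": "jewel tones"},
    "brown": {"foundation": "tan",   "lipstick": "warm nude", "palette": "earth tones"},
    "white": {"foundation": "fair",  "lipstick": "soft pink", "palette": "pastels"},
}

def recommend(image_path: str) -> dict:
    """Classify one face image and return the matching product suggestions."""
    img = tf.keras.utils.load_img(image_path, target_size=IMG_SIZE)
    arr = tf.expand_dims(tf.keras.utils.img_to_array(img), 0)
    probs = model.predict(arr)[0]
    return RECOMMENDATIONS[CLASS_NAMES[int(tf.argmax(probs))]]
```

In practice the training call would be wrapped in a 5-fold cross-validation loop and the recommendation table replaced with the study's actual shade and palette mappings.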
Copyright (c) 2025 Nida Nurapipah, Siti Sarah Yuliana

This work is licensed under a Creative Commons Attribution 4.0 International License.
License:
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.