Skin Lesion Classification Using a Soft Voting Ensemble of Convolutional Neural Networks
- URL: http://arxiv.org/abs/2512.20431v1
- Date: Tue, 23 Dec 2025 15:20:47 GMT
- Title: Skin Lesion Classification Using a Soft Voting Ensemble of Convolutional Neural Networks
- Authors: Abdullah Al Shafi, Abdul Muntakim, Pintu Chandra Shill, Rowzatul Zannat, Abdullah Al-Amin
- Abstract summary: This paper presents an early skin cancer classification method using a soft voting ensemble of CNNs. The method achieved lesion recognition accuracies of 96.32%, 90.86%, and 93.92% for the three datasets.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Skin cancer can be identified by dermoscopic examination and ocular inspection, but early detection significantly increases survival chances. Artificial intelligence (AI), using annotated skin images and Convolutional Neural Networks (CNNs), improves diagnostic accuracy. This paper presents an early skin cancer classification method using a soft voting ensemble of CNNs. In this investigation, three benchmark datasets, namely HAM10000, ISIC 2016, and ISIC 2019, were used. The process involved rebalancing, image augmentation, and filtering techniques, followed by a hybrid dual encoder for segmentation via transfer learning. Accurate segmentation focused the classification models on clinically significant features, reducing background artifacts and improving accuracy. Classification was performed through an ensemble of MobileNetV2, VGG19, and InceptionV3, balancing accuracy and speed for real-world deployment. The method achieved lesion recognition accuracies of 96.32%, 90.86%, and 93.92% for the three datasets. The system performance was evaluated using established skin lesion detection metrics, yielding impressive results.
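The soft voting step described in the abstract can be sketched as follows: each backbone's softmax probability vector is averaged across models, and the class with the highest mean probability is chosen. The probability vectors below are illustrative stand-ins, not outputs of the paper's actual trained MobileNetV2, VGG19, and InceptionV3 models.

```python
import numpy as np

def soft_vote(prob_lists):
    """Average per-class probability vectors from several classifiers (soft voting)."""
    probs = np.asarray(prob_lists, dtype=float)  # shape: (n_models, n_classes)
    return probs.mean(axis=0)

# Hypothetical softmax outputs standing in for MobileNetV2, VGG19, and InceptionV3.
p_mobilenet = [0.70, 0.20, 0.10]
p_vgg19     = [0.60, 0.30, 0.10]
p_inception = [0.50, 0.25, 0.25]

avg = soft_vote([p_mobilenet, p_vgg19, p_inception])
predicted_class = int(np.argmax(avg))  # class with the highest averaged probability
```

Unlike hard (majority) voting, soft voting uses each model's confidence, so a model that is very sure of a class can outweigh two models that are only marginally leaning the other way.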
Related papers
- A Machine Vision Approach to Preliminary Skin Lesion Assessments [0.0]
This study evaluates a comprehensive system for preliminary skin lesion assessment that combines the clinically established ABCD rule of dermoscopy with machine learning classification. A custom three-layer Convolutional Neural Network (CNN) trained from scratch achieved 78.5% accuracy and 86.5% recall on median-filtered images.
arXiv Detail & Related papers (2026-01-21T23:48:59Z)
- GS-TransUNet: Integrated 2D Gaussian Splatting and Transformer UNet for Accurate Skin Lesion Analysis [44.99833362998488]
We present a novel approach that combines 2D Gaussian splatting with the Transformer UNet architecture for automated skin cancer diagnosis. Our findings illustrate significant advancements in the precision of segmentation and classification. This integration sets new benchmarks in the field and highlights the potential for further research into multi-task medical image analysis methodologies.
arXiv Detail & Related papers (2025-02-23T23:28:47Z)
- Advanced Hybrid Deep Learning Model for Enhanced Classification of Osteosarcoma Histopathology Images [0.0]
This study focuses on osteosarcoma (OS), the most common bone cancer in children and adolescents, which affects the long bones of the arms and legs.
We propose a novel hybrid model that combines convolutional neural networks (CNN) and vision transformers (ViT) to improve diagnostic accuracy for OS.
The model achieved an accuracy of 99.08%, precision of 99.10%, recall of 99.28%, and an F1-score of 99.23%.
arXiv Detail & Related papers (2024-10-29T13:54:08Z)
- Brain Tumor Classification on MRI in Light of Molecular Markers [56.99710477905796]
Co-deletion of the 1p/19q gene is associated with clinical outcomes in low-grade gliomas. This study aims to utilize a specially designed MRI-based convolutional neural network for brain cancer detection.
arXiv Detail & Related papers (2024-09-29T07:04:26Z)
- Breast Cancer Image Classification Method Based on Deep Transfer Learning [40.392772795903795]
A breast cancer image classification model algorithm combining deep learning and transfer learning is proposed.
Experimental results demonstrate that the algorithm achieves an efficiency of over 84.0% in the test set, with a significantly improved classification accuracy compared to previous models.
arXiv Detail & Related papers (2024-04-14T12:09:47Z)
- Breast Ultrasound Tumor Classification Using a Hybrid Multitask CNN-Transformer Network [63.845552349914186]
Capturing global contextual information plays a critical role in breast ultrasound (BUS) image classification.
Vision Transformers have an improved capability of capturing global contextual information but may distort the local image patterns due to the tokenization operations.
In this study, we proposed a hybrid multitask deep neural network called Hybrid-MT-ESTAN, designed to perform BUS tumor classification and segmentation.
arXiv Detail & Related papers (2023-08-04T01:19:32Z)
- Reduced Deep Convolutional Activation Features (R-DeCAF) in Histopathology Images to Improve the Classification Performance for Breast Cancer Diagnosis [0.0]
Breast cancer is the second most common cancer among women worldwide.
Deep convolutional neural networks (CNNs) are effective solutions.
The features extracted from an activation layer of pre-trained CNNs are called deep convolutional activation features (DeCAF).
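The DeCAF idea can be illustrated by flattening an activation map from a pre-trained CNN into a feature vector; the toy array below is a stand-in for a real activation map, and the dimensionality-reduction step that gives R-DeCAF its name is omitted here.

```python
import numpy as np

def decaf_features(activation_map):
    """Flatten a convolutional activation map into a DeCAF-style feature vector.

    activation_map: array of shape (height, width, channels) taken from an
    activation layer of a pre-trained CNN. The layer choice is an assumption;
    the R-DeCAF paper additionally reduces this vector before classification.
    """
    return np.asarray(activation_map, dtype=float).ravel()

fmap = np.arange(2 * 2 * 3).reshape(2, 2, 3)  # toy 2x2x3 activation map
vec = decaf_features(fmap)                    # 12-dimensional feature vector
```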
arXiv Detail & Related papers (2023-01-05T06:53:46Z)
- Mediastinal Lymph Node Detection and Segmentation Using Deep Learning [1.7188280334580195]
In clinical practice, computed tomography (CT) and positron emission tomography (PET) imaging are used to detect abnormal lymph nodes (LNs).
Deep convolutional neural networks are widely used to segment structures in medical images.
The well-established UNet architecture was modified with a bilinear and total generalized variation (TGV) based upsampling strategy to segment and detect mediastinal lymph nodes.
The modified UNet preserves texture discontinuities, selects noisy areas, finds appropriate balance points through backpropagation, and restores image resolution.
arXiv Detail & Related papers (2022-11-24T02:55:20Z)
- Lightweight 3D Convolutional Neural Network for Schizophrenia Diagnosis using MRI Images and Ensemble Bagging Classifier [1.487444917213389]
This paper proposed a lightweight 3D convolutional neural network (CNN) based framework for schizophrenia diagnosis using MRI images.
The model achieves an accuracy of 92.22%, sensitivity of 94.44%, specificity of 90%, precision of 90.43%, recall of 94.44%, F1-score of 92.39%, and G-mean of 92.19%, the highest among current state-of-the-art techniques.
arXiv Detail & Related papers (2022-11-05T10:27:37Z)
- Ensemble of CNN classifiers using Sugeno Fuzzy Integral Technique for Cervical Cytology Image Classification [1.6986898305640261]
We propose a fully automated computer-aided diagnosis tool for classifying single-cell and slide images of cervical cancer.
We use the Sugeno Fuzzy Integral to ensemble the decision scores from three popular deep learning models, namely, Inception v3, DenseNet-161 and ResNet-34.
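A minimal sketch of Sugeno-integral fusion, assuming fuzzy densities that sum to 1 (which makes the lambda-fuzzy measure simply additive, lambda = 0); the densities and score matrix below are illustrative, not the paper's learned values for Inception v3, DenseNet-161, and ResNet-34.

```python
import numpy as np

def sugeno_integral(scores, densities):
    """Sugeno fuzzy integral of per-model confidence scores for one class.

    scores:    confidence of each model for this class
    densities: fuzzy density (worth) of each model; chosen to sum to 1 here,
               so the lambda-fuzzy measure reduces to a cumulative sum.
    """
    scores = np.asarray(scores, dtype=float)
    densities = np.asarray(densities, dtype=float)
    order = np.argsort(scores)[::-1]      # sort models by descending score
    g = np.cumsum(densities[order])       # measure of the top-i model subsets
    return float(np.max(np.minimum(scores[order], g)))

def fuzzy_ensemble(prob_matrix, densities):
    """Fuse a (n_models, n_classes) score matrix class by class."""
    fused = [sugeno_integral(col, densities) for col in np.asarray(prob_matrix).T]
    return int(np.argmax(fused)), fused

# Three hypothetical models scoring two classes, with illustrative densities.
pred, fused = fuzzy_ensemble([[0.9, 0.1], [0.6, 0.4], [0.3, 0.7]],
                             [0.4, 0.35, 0.25])
```

Unlike plain averaging, the fuzzy integral weights each model's score by the measure of the coalition of models that agree at least as strongly, so a trusted model's confident vote is harder to outvote.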
arXiv Detail & Related papers (2021-08-21T08:41:41Z)
- Wide & Deep neural network model for patch aggregation in CNN-based prostate cancer detection systems [51.19354417900591]
Prostate cancer (PCa) is one of the leading causes of death among men, with almost 1.41 million new cases and around 375,000 deaths in 2020.
To perform an automatic diagnosis, prostate tissue samples are first digitized into gigapixel-resolution whole-slide images.
Small subimages called patches are extracted and predicted, obtaining a patch-level classification.
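The patch-to-slide aggregation idea can be illustrated with simple summary statistics over patch-level probabilities; the paper itself learns this aggregation with a Wide & Deep network, so the hand-crafted features and threshold below are only a hypothetical stand-in.

```python
import numpy as np

def slide_level_features(patch_probs, threshold=0.5):
    """Summarize patch-level cancer probabilities into slide-level features.

    A simple stand-in for a learned aggregation model: the mean probability,
    the maximum, and the fraction of patches above a decision threshold.
    """
    p = np.asarray(patch_probs, dtype=float)
    return {
        "mean_prob": float(p.mean()),
        "max_prob": float(p.max()),
        "positive_fraction": float((p >= threshold).mean()),
    }

# Hypothetical patch-level predictions for one whole-slide image.
feats = slide_level_features([0.1, 0.2, 0.9, 0.8, 0.05])
```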
arXiv Detail & Related papers (2021-05-20T18:13:58Z)
- Classification of COVID-19 in CT Scans using Multi-Source Transfer Learning [91.3755431537592]
We propose the use of Multi-Source Transfer Learning to improve upon traditional Transfer Learning for the classification of COVID-19 from CT scans.
With our multi-source fine-tuning approach, our models outperformed baseline models fine-tuned with ImageNet.
Our best performing model was able to achieve an accuracy of 0.893 and a Recall score of 0.897, outperforming its baseline Recall score by 9.3%.
arXiv Detail & Related papers (2020-09-22T11:53:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.