AN EXPLAINABLE ARTIFICIAL INTELLIGENCE ENSEMBLE FRAMEWORK FOR ROBUST CARCINOMA IMAGE CLASSIFICATION

Authors

  • Ja'afar Muhammad Bello
  • Dr. M S Argungu
  • Dr. H U Suru

Keywords:

Explainable Artificial Intelligence, Ensemble Learning, Carcinoma, Deep Learning, Soft Voting

Abstract

This study proposes an explainable ensemble deep learning framework for carcinoma image classification using five pretrained convolutional neural network architectures: ResNet50, DenseNet201, MobileNetV2, EfficientNetB0, and Xception. The models were fine-tuned on the HAM10000 dermoscopic image dataset and integrated through a soft voting ensemble strategy to enhance classification robustness. An additional 'Unknown' class was introduced to allow the system to handle non‑skin or irrelevant images in real-world deployment. Experimental evaluation shows that the best-performing model (Xception) achieved 89% accuracy, 88% precision, 86% recall, and an 87% F1‑score. To improve interpretability, Explainable Artificial Intelligence (XAI) techniques including Grad‑CAM, SHAP, LIME, Saliency Maps, and Integrated Gradients were applied to highlight the image regions influencing model predictions. The results demonstrate that integrating ensemble learning with explainable AI improves diagnostic transparency and reliability, making the approach suitable for clinical decision-support systems.
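The soft voting strategy described in the abstract can be sketched as follows: each of the five fine-tuned networks outputs a probability distribution over the classes, the ensemble averages those distributions, and the class with the highest mean probability is the prediction. This is a minimal illustrative sketch, not the authors' implementation; the class labels (the seven HAM10000 lesion categories plus the added 'Unknown' class) and the example probabilities are assumptions for demonstration only.

```python
import numpy as np

# Assumed class layout: the 7 HAM10000 lesion classes plus the paper's
# added 'Unknown' class for non-skin / irrelevant images.
CLASSES = ["akiec", "bcc", "bkl", "df", "mel", "nv", "vasc", "unknown"]

def soft_vote(prob_matrix: np.ndarray) -> tuple[int, np.ndarray]:
    """Average per-model class probabilities (soft voting) and return
    (predicted class index, mean probability vector).

    prob_matrix: shape (n_models, n_classes); each row is one model's
    softmax output and sums to 1.
    """
    mean_probs = prob_matrix.mean(axis=0)
    return int(np.argmax(mean_probs)), mean_probs

# Hypothetical outputs from the five models for a single dermoscopic image:
# three models lean toward melanoma ('mel'), two toward nevus ('nv').
probs = np.array([
    [0.02, 0.03, 0.05, 0.02, 0.55, 0.28, 0.03, 0.02],
    [0.01, 0.02, 0.04, 0.03, 0.60, 0.25, 0.03, 0.02],
    [0.02, 0.02, 0.06, 0.02, 0.48, 0.35, 0.03, 0.02],
    [0.03, 0.04, 0.05, 0.03, 0.30, 0.50, 0.03, 0.02],
    [0.02, 0.03, 0.05, 0.02, 0.35, 0.48, 0.03, 0.02],
])
idx, mean_probs = soft_vote(probs)
print(CLASSES[idx])  # the ensemble's soft-voted prediction: "mel"
```

Because soft voting averages confidences rather than counting hard votes, a model that is highly confident can outweigh two weakly confident dissenters, which is the robustness property the ensemble relies on.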

Published

2026-03-11

How to Cite

Ja’afar Muhammad Bello, Dr. M S Argungu, & Dr. H U Suru. (2026). AN EXPLAINABLE ARTIFICIAL INTELLIGENCE ENSEMBLE FRAMEWORK FOR ROBUST CARCINOMA IMAGE CLASSIFICATION. BW Academic Journal. Retrieved from https://mail.bwjournal.org/index.php/bsjournal/article/view/3826