AN EXPLAINABLE ARTIFICIAL INTELLIGENCE ENSEMBLE FRAMEWORK FOR ROBUST CARCINOMA IMAGE CLASSIFICATION
Keywords: Explainable Artificial Intelligence, Ensemble Learning, Carcinoma, Deep Learning, Soft Voting

Abstract
This study proposes an explainable ensemble deep learning framework for carcinoma image classification using five pretrained convolutional neural network architectures: ResNet50, DenseNet201, MobileNetV2, EfficientNetB0, and Xception. The models were fine-tuned on the HAM10000 dermoscopic image dataset and integrated through a soft voting ensemble strategy to enhance classification robustness. An additional 'Unknown' class was introduced to allow the system to handle non-skin or irrelevant images in real-world deployment. Experimental evaluation shows that the best-performing model (Xception) achieved 89% accuracy, 88% precision, 86% recall, and 87% F1-score. To improve interpretability, Explainable Artificial Intelligence (XAI) techniques, including Grad-CAM, SHAP, LIME, Saliency Maps, and Integrated Gradients, were applied to highlight the image regions influencing model predictions. The results demonstrate that integrating ensemble learning with explainable AI improves diagnostic transparency and reliability, making the approach suitable for clinical decision-support systems.
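The soft voting strategy described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each fine-tuned CNN outputs a per-class probability matrix (e.g. from a softmax layer), and the helper name `soft_vote` and the toy probabilities are hypothetical.

```python
import numpy as np

def soft_vote(prob_matrices):
    """Soft voting: average per-model class probabilities, then take argmax.

    prob_matrices: list of arrays of shape (n_samples, n_classes),
    one per base model (e.g. ResNet50, DenseNet201, ...).
    Returns (predicted_labels, averaged_probabilities).
    """
    avg = np.mean(np.stack(prob_matrices, axis=0), axis=0)
    return avg.argmax(axis=1), avg

# Toy example: three hypothetical models, two samples, three classes.
p1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
p2 = np.array([[0.6, 0.3, 0.1], [0.2, 0.2, 0.6]])
p3 = np.array([[0.5, 0.4, 0.1], [0.3, 0.3, 0.4]])
labels, probs = soft_vote([p1, p2, p3])
```

Averaging probabilities (rather than hard majority voting) lets a model's confidence weigh into the final decision, which is what makes soft voting robust when individual models disagree.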




