BREAST LESION SEGMENTATION AND CLASSIFICATION USING U-NET SALIENCY ESTIMATION AND EXPLAINABLE RESIDUAL CONVOLUTIONAL NEURAL NETWORK

Mamuna Fatima, Muhammad Attique Khan, Saima Shaheen, Hussain Mobarak Albarakati, Shuihua Wang, Syeda Fizzah Jilani*, Mohammad Shabaz*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Breast cancer (BrC) is one of the most common causes of death among women worldwide. Breast images (mammography or ultrasound) may show anomalies that represent early indicators of BrC. However, accurate breast image interpretation requires labor-intensive procedures and highly skilled medical professionals. As a second opinion for the physician, deep learning (DL) tools can be useful for diagnosing and classifying malignant and benign lesions. However, because DL algorithms lack interpretability, it is difficult for experts to understand how a label is predicted. In this work, we propose a multitask framework combining U-Net saliency estimation and a DL model for breast lesion segmentation and classification using ultrasound images. A new contrast enhancement technique is proposed to improve the quality of the original images. A new technique, the U-Net saliency map, is then proposed for the segmentation of breast lesions. Simultaneously, a MobileNetV2 deep model is fine-tuned with additional residual blocks and trained from scratch on both original and enhanced images; the additional blocks reduce the number of parameters and improve learning on ultrasound images. After training, features are extracted from the deeper layers of both models. In a later step, a new cross-entropy controlled sine-cosine algorithm is developed to select the best features; the main purpose of this step is to remove features irrelevant to the classification phase. The selected features are then fused by a serial-based Manhattan Distance (SbMD) approach, and the resultant vector is classified using machine learning classifiers. The results indicate that a wide neural network (WNN) obtained the highest accuracy of 98.9% and a sensitivity of 98.70% on the selected breast ultrasound image dataset. A comparison with state-of-the-art techniques shows the improved performance of the proposed method.
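The abstract names a cross-entropy controlled sine-cosine algorithm (SCA) for feature selection but does not give implementation details. Below is a minimal, hypothetical Python sketch of such a selection loop, assuming the standard SCA position-update rule and using held-out cross-entropy (log loss) of a logistic-regression probe classifier as the fitness. The function name, the probe classifier, and the 0.5 binarization threshold are illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

def sca_feature_selection(X, y, n_agents=10, n_iters=30, a=2.0, seed=0):
    """Select a feature subset with a sine-cosine search whose fitness is
    held-out cross-entropy (log loss) of a simple probe classifier.
    Hypothetical sketch; the paper's exact variant is not specified here."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Continuous positions in [0, 1]; a feature is kept when its value > 0.5.
    pos = rng.random((n_agents, n_features))
    Xtr, Xva, ytr, yva = train_test_split(
        X, y, test_size=0.3, random_state=seed, stratify=y)

    def fitness(mask):
        if not mask.any():                      # forbid empty feature subsets
            return np.inf
        clf = LogisticRegression(max_iter=1000).fit(Xtr[:, mask], ytr)
        # Cross-entropy on held-out data "controls" the search.
        return log_loss(yva, clf.predict_proba(Xva[:, mask]))

    scores = np.array([fitness(p > 0.5) for p in pos])
    best_i = scores.argmin()
    best, best_score = pos[best_i].copy(), scores[best_i]

    for t in range(n_iters):
        r1 = a - t * (a / n_iters)              # linearly decaying amplitude
        for i in range(n_agents):
            r2 = rng.uniform(0.0, 2.0 * np.pi, n_features)
            r3 = rng.uniform(0.0, 2.0, n_features)
            r4 = rng.random(n_features)
            # Standard SCA update: oscillate toward the best position found.
            step = np.where(
                r4 < 0.5,
                r1 * np.sin(r2) * np.abs(r3 * best - pos[i]),
                r1 * np.cos(r2) * np.abs(r3 * best - pos[i]))
            pos[i] = np.clip(pos[i] + step, 0.0, 1.0)
            s = fitness(pos[i] > 0.5)
            if s < best_score:
                best, best_score = pos[i].copy(), s
    return best > 0.5                           # boolean mask of kept features
```

In the pipeline the abstract describes, the returned mask would be applied to the deep features extracted from the two networks before the SbMD fusion step; the logistic-regression probe here merely stands in for whatever fitness evaluator the authors actually used.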

Original language: English
Article number: 2440060
Journal: Fractals
DOIs:
Publication status: Accepted/In press - 2024

Keywords

  • Breast Cancer
  • Classification
  • Contrast Stretching
  • Feature Optimization
  • Fusion
  • Image Classification
  • Image Processing
  • Saliency Map
  • U-NET
