TY - JOUR
T1 - Breast Lesion Segmentation and Classification Using U-Net Saliency Estimation and Explainable Residual Convolutional Neural Network
AU - Fatima, Mamuna
AU - Khan, Muhammad Attique
AU - Shaheen, Saima
AU - Albarakati, Hussain Mobarak
AU - Wang, Shuihua
AU - Jilani, Syeda Fizzah
AU - Shabaz, Mohammad
N1 - Publisher Copyright:
© The Author(s)
PY - 2024
Y1 - 2024
AB - Breast cancer (BrC) is one of the most common causes of death among women worldwide. Breast images (mammography or ultrasound) may show anomalies that represent early indicators of BrC. However, accurate breast image interpretation requires labor-intensive procedures and highly skilled medical professionals. As a second opinion for the physician, deep learning (DL) tools can be useful for the diagnosis and classification of malignant and benign lesions. However, because DL algorithms lack interpretability, it is difficult for experts to understand how a label is predicted. In this work, we propose multitask U-Net saliency estimation and a DL model for breast lesion segmentation and classification using ultrasound images. A new contrast enhancement technique is proposed to improve the quality of the original images. A new technique, called the U-Net saliency map, is then proposed for the segmentation of breast lesions. Simultaneously, a MobileNetV2 deep model is fine-tuned with additional residual blocks and trained from scratch on the original and enhanced images; the additional blocks reduce the number of parameters and improve the learning of ultrasound images. Both models are trained from scratch, and features are extracted from their deeper layers. Next, a new cross-entropy-controlled sine-cosine algorithm is developed to select the best features; this step removes irrelevant features before the classification phase. The selected features are then fused using a serial-based Manhattan distance (SbMD) approach, and the resultant vector is classified with machine learning classifiers. The results indicate that a wide neural network (WNN) obtained the highest accuracy of 98.9% and a sensitivity of 98.70% on the selected breast ultrasound image dataset. The accuracy of the proposed method is compared with state-of-the-art (SoArt) techniques, showing improved performance.
KW - Breast Cancer
KW - Classification
KW - Contrast Stretching
KW - Feature Optimization
KW - Fusion
KW - Image Classification
KW - Image Processing
KW - Saliency Map
KW - U-Net
UR - http://www.scopus.com/inward/record.url?scp=85210980475&partnerID=8YFLogxK
U2 - 10.1142/S0218348X24400607
DO - 10.1142/S0218348X24400607
M3 - Article
AN - SCOPUS:85210980475
SN - 0218-348X
JO - Fractals
JF - Fractals
M1 - 2440060
ER -