TY - JOUR
T1 - BGRD-TransUNet
T2 - A Novel TransUNet-Based Model for Ultrasound Breast Lesion Segmentation
AU - Ji, Zhanlin
AU - Sun, Haoran
AU - Yuan, Na
AU - Zhang, Haiyang
AU - Sheng, Jiaxi
AU - Zhang, Xueji
AU - Ganchev, Ivan
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2024
Y1 - 2024
N2 - Breast UltraSound (BUS) imaging is a commonly used diagnostic tool in the fight against breast diseases, especially for the early detection and diagnosis of breast cancer. Due to the inherent characteristics of ultrasound images, such as blurry boundaries and diverse tumor morphologies, it is challenging for doctors to manually segment breast tumors. In recent years, Convolutional Neural Network (CNN) technology has been widely applied to automatically segment BUS images. However, due to their inherent limitations in capturing global contextual information, CNNs struggle to model the full image context. To address this issue, this paper proposes a novel BGRD-TransUNet model for breast lesion segmentation, based on TransUNet. The proposed model first replaces the original ResNet50 backbone network of TransUNet with DenseNet121 for initial feature extraction. Next, newly designed Residual Multi-Scale Feature Modules (RMSFMs) are employed to extract features from various layers of DenseNet121, thus capturing richer features within specific layers. Thirdly, a Boundary Guidance (BG) network is added to enhance the contour information of BUS images. Additionally, newly designed Boundary Attentional Feature Fusion Modules (BAFFMs) are used to integrate edge information with the features extracted through the RMSFMs. Finally, newly designed Parallel Channel and Spatial Attention Modules (PCSAMs) are used to refine feature extraction using channel and spatial attention. Extensive experimental testing performed on two public datasets demonstrates that the proposed BGRD-TransUNet model outperforms all state-of-the-art medical image segmentation models participating in the experiments according to all evaluation metrics used (except for a few isolated cases), including the two most important and widely used metrics in the field of medical image segmentation, namely the Intersection over Union (IoU) and the Dice Similarity Coefficient (DSC). More specifically, on the BUSI dataset and dataset B, BGRD-TransUNet achieves IoU values of 76.77% and 86.61%, and DSC values of 85.08% and 92.47%, respectively, which are higher by 7.27 and 3.64 percentage points (IoU), and by 5.81 and 2.54 percentage points (DSC), than the corresponding values achieved by the baseline (TransUNet).
AB - Breast UltraSound (BUS) imaging is a commonly used diagnostic tool in the fight against breast diseases, especially for the early detection and diagnosis of breast cancer. Due to the inherent characteristics of ultrasound images, such as blurry boundaries and diverse tumor morphologies, it is challenging for doctors to manually segment breast tumors. In recent years, Convolutional Neural Network (CNN) technology has been widely applied to automatically segment BUS images. However, due to their inherent limitations in capturing global contextual information, CNNs struggle to model the full image context. To address this issue, this paper proposes a novel BGRD-TransUNet model for breast lesion segmentation, based on TransUNet. The proposed model first replaces the original ResNet50 backbone network of TransUNet with DenseNet121 for initial feature extraction. Next, newly designed Residual Multi-Scale Feature Modules (RMSFMs) are employed to extract features from various layers of DenseNet121, thus capturing richer features within specific layers. Thirdly, a Boundary Guidance (BG) network is added to enhance the contour information of BUS images. Additionally, newly designed Boundary Attentional Feature Fusion Modules (BAFFMs) are used to integrate edge information with the features extracted through the RMSFMs. Finally, newly designed Parallel Channel and Spatial Attention Modules (PCSAMs) are used to refine feature extraction using channel and spatial attention. Extensive experimental testing performed on two public datasets demonstrates that the proposed BGRD-TransUNet model outperforms all state-of-the-art medical image segmentation models participating in the experiments according to all evaluation metrics used (except for a few isolated cases), including the two most important and widely used metrics in the field of medical image segmentation, namely the Intersection over Union (IoU) and the Dice Similarity Coefficient (DSC). More specifically, on the BUSI dataset and dataset B, BGRD-TransUNet achieves IoU values of 76.77% and 86.61%, and DSC values of 85.08% and 92.47%, respectively, which are higher by 7.27 and 3.64 percentage points (IoU), and by 5.81 and 2.54 percentage points (DSC), than the corresponding values achieved by the baseline (TransUNet).
KW - Breast disease
KW - TransUNet
KW - breast ultrasound (BUS)
KW - medical image segmentation
KW - tumor segmentation
UR - http://www.scopus.com/inward/record.url?scp=85186068657&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2024.3368170
DO - 10.1109/ACCESS.2024.3368170
M3 - Article
AN - SCOPUS:85186068657
SN - 2169-3536
VL - 12
SP - 31182
EP - 31196
JO - IEEE Access
JF - IEEE Access
ER -