TY - JOUR
T1 - CTMLP
T2 - Can MLPs replace CNNs or transformers for COVID-19 diagnosis?
AU - Sun, Junding
AU - Pi, Pengpeng
AU - Tang, Chaosheng
AU - Wang, Shui Hua
AU - Zhang, Yu Dong
N1 - Publisher Copyright:
© 2023 The Author(s)
PY - 2023/6
Y1 - 2023/6
N2 - Background: Convolutional Neural Networks (CNNs) and hybrid models combining CNNs with Vision Transformers (ViTs) are the current mainstream methods for COVID-19 medical image diagnosis. However, pure CNNs lack global modeling ability, and hybrid models of CNNs and ViTs suffer from large parameter counts and high computational complexity, making them difficult to use effectively for medical diagnosis in just-in-time applications. Methods: Therefore, a lightweight medical diagnosis network, CTMLP, based on convolutions and multi-layer perceptrons (MLPs), is proposed for the diagnosis of COVID-19. Previous self-supervised algorithms were designed for CNNs and ViTs, and their effectiveness for MLPs is not yet known; moreover, the medical imaging domain lacks ImageNet-scale datasets for model pre-training. A pre-training scheme, TL-DeCo, based on transfer learning and self-supervised learning was therefore constructed. Because running TL-DeCo from scratch for every new model is tedious and resource-consuming, a guided self-supervised pre-training scheme was additionally constructed for pre-training new lightweight models. Results: The proposed CTMLP achieves an accuracy of 97.51%, an F1-score of 97.43%, and a recall of 98.91% without pre-training, with only 48% of the parameters of ResNet50. Furthermore, the proposed guided self-supervised learning scheme improves the baseline of simple self-supervised learning by 1%–1.27%. Conclusion: The results show that the proposed CTMLP can replace CNNs or Transformers for more efficient diagnosis of COVID-19, and the accompanying pre-training framework makes it more promising in clinical practice.
AB - Background: Convolutional Neural Networks (CNNs) and hybrid models combining CNNs with Vision Transformers (ViTs) are the current mainstream methods for COVID-19 medical image diagnosis. However, pure CNNs lack global modeling ability, and hybrid models of CNNs and ViTs suffer from large parameter counts and high computational complexity, making them difficult to use effectively for medical diagnosis in just-in-time applications. Methods: Therefore, a lightweight medical diagnosis network, CTMLP, based on convolutions and multi-layer perceptrons (MLPs), is proposed for the diagnosis of COVID-19. Previous self-supervised algorithms were designed for CNNs and ViTs, and their effectiveness for MLPs is not yet known; moreover, the medical imaging domain lacks ImageNet-scale datasets for model pre-training. A pre-training scheme, TL-DeCo, based on transfer learning and self-supervised learning was therefore constructed. Because running TL-DeCo from scratch for every new model is tedious and resource-consuming, a guided self-supervised pre-training scheme was additionally constructed for pre-training new lightweight models. Results: The proposed CTMLP achieves an accuracy of 97.51%, an F1-score of 97.43%, and a recall of 98.91% without pre-training, with only 48% of the parameters of ResNet50. Furthermore, the proposed guided self-supervised learning scheme improves the baseline of simple self-supervised learning by 1%–1.27%. Conclusion: The results show that the proposed CTMLP can replace CNNs or Transformers for more efficient diagnosis of COVID-19, and the accompanying pre-training framework makes it more promising in clinical practice.
KW - CNNs
KW - COVID-19
KW - Guided self-supervised learning
KW - MLPs
KW - Transfer learning
KW - Transformers
UR - http://www.scopus.com/inward/record.url?scp=85152596994&partnerID=8YFLogxK
U2 - 10.1016/j.compbiomed.2023.106847
DO - 10.1016/j.compbiomed.2023.106847
M3 - Article
C2 - 37068316
AN - SCOPUS:85152596994
SN - 0010-4825
VL - 159
JO - Computers in Biology and Medicine
JF - Computers in Biology and Medicine
M1 - 106847
ER -