TY - JOUR
T1 - DenseNet-201-Based Deep Neural Network with Composite Learning Factor and Precomputation for Multiple Sclerosis Classification
AU - Wang, Shui Hua
AU - Zhang, Yu Dong
N1 - Publisher Copyright:
© 2020 ACM.
PY - 2020/7
Y1 - 2020/7
N2 - (Aim) Multiple sclerosis is a neurological condition that may cause neurologic disability. Convolutional neural networks (CNNs) can achieve good results, but tuning CNN hyperparameters requires expert knowledge and is difficult and time-consuming. To identify multiple sclerosis more accurately, this article proposes a new transfer-learning-based approach. (Method) DenseNet-121, DenseNet-169, and DenseNet-201 neural networks were compared. In addition, we proposed the use of a composite learning factor (CLF) that assigns different learning factors to three types of layers: early frozen layers, middle layers, and late replaced layers. How to allocate layers into these three groups remains a problem; hence, four transfer learning settings (viz., Settings A, B, C, and D) were tested and compared. A precomputation method was utilized to reduce the storage burden and accelerate the program. (Results) We observed that DenseNet-201-D (the layers from CP to T3 are frozen, the layers of D4 are updated with a learning factor of 1, and the final new layers of FCL are randomly initialized with a learning factor of 10) achieves the best performance. The sensitivity, specificity, and accuracy of DenseNet-201-D were 98.27 ± 0.58, 98.35 ± 0.69, and 98.31 ± 0.53, respectively. (Conclusion) Our method gives better performance than state-of-the-art approaches. Furthermore, the composite learning factor gives superior results to the traditional simple learning factor (SLF) strategy.
AB - (Aim) Multiple sclerosis is a neurological condition that may cause neurologic disability. Convolutional neural networks (CNNs) can achieve good results, but tuning CNN hyperparameters requires expert knowledge and is difficult and time-consuming. To identify multiple sclerosis more accurately, this article proposes a new transfer-learning-based approach. (Method) DenseNet-121, DenseNet-169, and DenseNet-201 neural networks were compared. In addition, we proposed the use of a composite learning factor (CLF) that assigns different learning factors to three types of layers: early frozen layers, middle layers, and late replaced layers. How to allocate layers into these three groups remains a problem; hence, four transfer learning settings (viz., Settings A, B, C, and D) were tested and compared. A precomputation method was utilized to reduce the storage burden and accelerate the program. (Results) We observed that DenseNet-201-D (the layers from CP to T3 are frozen, the layers of D4 are updated with a learning factor of 1, and the final new layers of FCL are randomly initialized with a learning factor of 10) achieves the best performance. The sensitivity, specificity, and accuracy of DenseNet-201-D were 98.27 ± 0.58, 98.35 ± 0.69, and 98.31 ± 0.53, respectively. (Conclusion) Our method gives better performance than state-of-the-art approaches. Furthermore, the composite learning factor gives superior results to the traditional simple learning factor (SLF) strategy.
KW - DenseNet
KW - Multiple sclerosis
KW - composite learning factor
KW - deep learning
KW - deep neural network
KW - precomputation
KW - simple learning factor
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85088953408&partnerID=8YFLogxK
U2 - 10.1145/3341095
DO - 10.1145/3341095
M3 - Article
AN - SCOPUS:85088953408
SN - 1551-6857
VL - 16
JO - ACM Transactions on Multimedia Computing, Communications and Applications
JF - ACM Transactions on Multimedia Computing, Communications and Applications
IS - 2s
M1 - 60
ER -