TY - JOUR
T1 - A Quantitative Insight Into the Role of Skip Connections in Deep Neural Networks of Low Complexity
T2 - A Case Study Directed at Fluid Flow Modeling
AU - Choubineh, Abouzar
AU - Chen, Jie
AU - Coenen, Frans
AU - Ma, Fei
N1 - Funding Information:
This work is partially supported by Key Program Special Fund in XJTLU (KSF-E-50, KSF-E-21), XJTLU Postgraduate Research Scholarship (PGRS1912009), and XJTLU Research Development Funding (RDF-19-01-15).
Publisher Copyright:
© 2023 American Society of Mechanical Engineers (ASME). All rights reserved.
PY - 2023/2
Y1 - 2023/2
AB - High-complexity deep feed-forward networks backpropagate the gradient of the loss function from the final layers to the earlier layers. As a consequence, the gradient may shrink rapidly toward zero, a phenomenon known as the vanishing gradient, which prevents earlier layers from benefiting from further training. One of the most effective techniques for mitigating this problem is the use of skip connection (shortcut) schemes, which enable the gradient to be backpropagated directly to earlier layers. This paper investigates whether skip connections significantly affect the performance of deep neural networks of low complexity, or whether their inclusion has little or no effect. The analysis was conducted using four Convolutional Neural Networks (CNNs) to predict four different multiscale basis functions for the mixed Generalized Multiscale Finite Element Method (GMsFEM). The models were applied to 249,375 samples. Three skip connection schemes were added to the base structure: Scheme 1 connects the first convolutional block to the last, Scheme 2 connects the middle block to the last, and Scheme 3 connects the middle block to both the last and the second-to-last blocks. The results demonstrate that the third scheme is the most effective: it increases the coefficient of determination (R²) by 0.0224–0.044 and decreases the Mean Squared Error (MSE) by 0.0027–0.0058 relative to the base structure. Hence, it is concluded that enriching the last convolutional blocks with information hidden in neighboring blocks is more effective than enriching them with information from earlier convolutional blocks near the input layer.
KW - backpropagation
KW - deep neural network
KW - GMsFEM
KW - heterogeneous porous media
KW - skip connection
KW - vanishing gradient phenomenon
UR - http://www.scopus.com/inward/record.url?scp=85144041011&partnerID=8YFLogxK
U2 - 10.1115/1.4054868
DO - 10.1115/1.4054868
M3 - Article
AN - SCOPUS:85144041011
SN - 1530-9827
VL - 23
SP - 1
EP - 21
JO - Journal of Computing and Information Science in Engineering
JF - Journal of Computing and Information Science in Engineering
IS - 1
M1 - 014502
ER -
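
For readers who want a concrete picture of the three skip connection schemes described in the abstract, the following is a minimal PyTorch sketch, not taken from the paper: the block count, channel width, input size, and merging of skips by element-wise addition are all assumptions made for illustration only.

```python
# Hypothetical sketch of the three skip-connection schemes named in the
# abstract. Five convolutional blocks at a uniform channel width are assumed
# so that skip tensors can be merged by element-wise addition; the paper's
# exact architecture may differ.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # One "convolutional block": a 3x3 convolution followed by ReLU.
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU())

class SkipCNN(nn.Module):
    def __init__(self, scheme=3, ch=16):
        super().__init__()
        self.scheme = scheme
        self.b1 = conv_block(1, ch)   # first block
        self.b2 = conv_block(ch, ch)
        self.b3 = conv_block(ch, ch)  # middle block
        self.b4 = conv_block(ch, ch)  # second-to-last block
        self.b5 = conv_block(ch, ch)  # last block
        self.head = nn.Conv2d(ch, 1, 1)

    def forward(self, x):
        x1 = self.b1(x)
        x3 = self.b3(self.b2(x1))
        x4 = self.b4(x3)
        if self.scheme == 3:          # Scheme 3: middle -> second-to-last
            x4 = x4 + x3
        x5 = self.b5(x4)
        if self.scheme == 1:          # Scheme 1: first -> last
            x5 = x5 + x1
        elif self.scheme in (2, 3):   # Schemes 2 and 3: middle -> last
            x5 = x5 + x3
        return self.head(x5)

# Usage on a single-channel input patch (size chosen arbitrarily here).
y = SkipCNN(scheme=3)(torch.randn(1, 1, 15, 15))
```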