TY - JOUR
T1 - Adaptive knowledge transfer for class incremental learning
AU - Feng, Zhikun
AU - Zhou, Mian
AU - Gao, Zan
AU - Stefanidis, Angelos
AU - Su, Jionglong
AU - Dang, Kang
AU - Li, Chuanhui
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/7
Y1 - 2024/7
N2 - Humans are excellent at adapting to constantly changing circumstances, but deep neural networks suffer from catastrophic forgetting. Recently, significant progress has been made with class-incremental methods based on dynamic network structures. However, these methods combine individual networks in a simplistic coupled manner, neglecting the fusion capabilities between modules and leading to a decline in overall prediction performance. To address this, we propose a class-incremental learning paradigm with adaptive knowledge transfer. This paradigm leverages crucial self-learning factors to transfer as much of the important knowledge of old classes as possible, allowing each module to integrate the optimal information from the current classes. Experiments demonstrate that our adaptive knowledge transfer module effectively reduces the sharpness of decision boundaries, thereby significantly improving the final accuracy. Additionally, we devise a compression module with supplementary learning to mitigate errors arising from long session sequences during model fusion. In extensive experiments on benchmarks, our approach improves average accuracy by more than 2.72% on CIFAR-100 and 1.30% on ImageNet-100/1000, achieving SOTA performance in both ordinary and challenging class-incremental settings.
AB - Humans are excellent at adapting to constantly changing circumstances, but deep neural networks suffer from catastrophic forgetting. Recently, significant progress has been made with class-incremental methods based on dynamic network structures. However, these methods combine individual networks in a simplistic coupled manner, neglecting the fusion capabilities between modules and leading to a decline in overall prediction performance. To address this, we propose a class-incremental learning paradigm with adaptive knowledge transfer. This paradigm leverages crucial self-learning factors to transfer as much of the important knowledge of old classes as possible, allowing each module to integrate the optimal information from the current classes. Experiments demonstrate that our adaptive knowledge transfer module effectively reduces the sharpness of decision boundaries, thereby significantly improving the final accuracy. Additionally, we devise a compression module with supplementary learning to mitigate errors arising from long session sequences during model fusion. In extensive experiments on benchmarks, our approach improves average accuracy by more than 2.72% on CIFAR-100 and 1.30% on ImageNet-100/1000, achieving SOTA performance in both ordinary and challenging class-incremental settings.
KW - Class incremental learning
KW - Dynamic network
KW - Knowledge distillation
KW - Knowledge sharing
UR - http://www.scopus.com/inward/record.url?scp=85194583329&partnerID=8YFLogxK
U2 - 10.1016/j.patrec.2024.05.011
DO - 10.1016/j.patrec.2024.05.011
M3 - Article
AN - SCOPUS:85194583329
SN - 0167-8655
VL - 183
SP - 165
EP - 171
JO - Pattern Recognition Letters
JF - Pattern Recognition Letters
ER -