TY - JOUR
T1 - You look from old classes
T2 - Towards accurate few-shot class-incremental learning
AU - Hu, Yijie
AU - Huang, Kaizhu
AU - Wang, Wei
AU - Huang, Xiaowei
AU - Wang, Qiufeng
N1 - Publisher Copyright:
© 2025
PY - 2026/4
Y1 - 2026/4
N2 - Few-shot class incremental learning (FSCIL) is a common but difficult task that faces two challenges: catastrophic forgetting of old classes and insufficient learning of new classes with limited samples. Recent wisdom focuses on preventing catastrophic forgetting yet overlooks the limited samples issue, resulting in poor new class performance. In this paper, we argue that old class samples contain rich knowledge, which can be exploited to supplement the learning of new classes. To this end, we propose to Look from Old Classes (YLOC) for FSCIL, enhancing both the base and incremental sessions. In the base session, we develop a prototype centered loss (PCL) to obtain a compact distribution of old classes. During incremental sessions, we devise a prototype augmentation learning (PAL) method to aid the learning of new classes by exploiting old classes. Extensive experiments on three FSCIL benchmark datasets demonstrate the superiority of our method.
KW - Catastrophic forgetting
KW - Class incremental learning
KW - Few-shot class incremental learning
KW - Few-shot learning
KW - Prototype learning
UR - https://www.scopus.com/pages/publications/105014544373
U2 - 10.1016/j.patcog.2025.112352
DO - 10.1016/j.patcog.2025.112352
M3 - Article
AN - SCOPUS:105014544373
SN - 0031-3203
VL - 172
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 112352
ER -