TY - GEN
T1 - Decoupling Overlapped Feature Spaces
T2 - 2025 IEEE International Conference on Multimedia and Expo, ICME 2025
AU - Feng, Zhi Kun
AU - Wu, Mingyu
AU - Kuang, Ping
AU - Dang, Kang
AU - Zhou, Mian
AU - Yu, Liu
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - The goal of Class Incremental Learning (CIL) is to continuously learn new classes while preventing forgetting of old ones. Most previous works have focused on reducing catastrophic forgetting from the model's perspective. However, the model is not the only factor contributing to forgetting. In this paper, we take the perspective of class instances and find that fine-grained class increments can lead to feature overlap between classes, further reducing instance margins. We refer to this phenomenon as the fine-grained class confusion effect in CIL. Since preserving instance margins is crucial for resisting forgetting, it is beneficial to maintain these margins as much as possible. To achieve this, we propose a general Gaussian decoupling classifier to enhance the discriminability of similar classes during incremental learning. Specifically, we decouple the features of different classes extracted by the backbone network into multiple independent Gaussian distributions. By directly integrating them into the features via weighted fusion, we introduce a regularization penalty that minimizes the overlap of similar features, thus increasing the feature distance between classes. Extensive experiments show that our method effectively improves class separation and better preserves instance margins, ultimately alleviating forgetting. The improved model achieves better performance on CUB-200 and CARS-196.
AB - The goal of Class Incremental Learning (CIL) is to continuously learn new classes while preventing forgetting of old ones. Most previous works have focused on reducing catastrophic forgetting from the model's perspective. However, the model is not the only factor contributing to forgetting. In this paper, we take the perspective of class instances and find that fine-grained class increments can lead to feature overlap between classes, further reducing instance margins. We refer to this phenomenon as the fine-grained class confusion effect in CIL. Since preserving instance margins is crucial for resisting forgetting, it is beneficial to maintain these margins as much as possible. To achieve this, we propose a general Gaussian decoupling classifier to enhance the discriminability of similar classes during incremental learning. Specifically, we decouple the features of different classes extracted by the backbone network into multiple independent Gaussian distributions. By directly integrating them into the features via weighted fusion, we introduce a regularization penalty that minimizes the overlap of similar features, thus increasing the feature distance between classes. Extensive experiments show that our method effectively improves class separation and better preserves instance margins, ultimately alleviating forgetting. The improved model achieves better performance on CUB-200 and CARS-196.
KW - Class Incremental Learning
KW - Continual Learning
KW - Fine-grained classification
UR - https://www.scopus.com/pages/publications/105022598670
U2 - 10.1109/ICME59968.2025.11209441
DO - 10.1109/ICME59968.2025.11209441
M3 - Conference Proceeding
AN - SCOPUS:105022598670
T3 - Proceedings - IEEE International Conference on Multimedia and Expo
BT - 2025 IEEE International Conference on Multimedia and Expo
PB - IEEE Computer Society
Y2 - 30 June 2025 through 4 July 2025
ER -