TY - GEN
T1 - Deep Active Learning Image Classification Algorithm Based on Class-Wise Self-Knowledge Distillation
AU - Pang, Yuliang
AU - Liu, Ying
AU - Hao, Yu
AU - Gong, Yanchao
AU - Li, Daxiang
AU - Xu, Zhijie
N1 - Publisher Copyright:
© 2023 ACM.
PY - 2023/9/22
Y1 - 2023/9/22
N2 - The success of deep learning-based image classification relies heavily on large amounts of labeled data, but data annotation is often expensive. This paper investigates active learning algorithms for image classification to reduce annotation cost. Traditional active learning algorithms for image classification, however, suffer from overfitting because deep neural networks with millions of parameters are trained on only a small amount of labeled data. This makes it difficult for the model to assess the informativeness of unlabeled samples and hinders the selection of high-value samples. To alleviate these issues, this paper proposes a deep active learning image classification algorithm based on class-wise self-knowledge distillation. The algorithm reduces overfitting and class-wise variations by matching, or distilling, the predictive distributions of different samples sharing the same label during training, thereby enabling the active learning algorithm to evaluate the informativeness of unlabeled data more accurately and improving the performance of the classification model. Additionally, an efficient Shuffle Attention mechanism is introduced to improve the sample selection strategy by combining spatial and channel feature information of the images. The proposed algorithm is compared with five active learning baselines on the CIFAR10, CIFAR100, SVHN, and FashionMNIST datasets. Experimental results demonstrate that the proposed algorithm achieves superior classification performance.
AB - The success of deep learning-based image classification relies heavily on large amounts of labeled data, but data annotation is often expensive. This paper investigates active learning algorithms for image classification to reduce annotation cost. Traditional active learning algorithms for image classification, however, suffer from overfitting because deep neural networks with millions of parameters are trained on only a small amount of labeled data. This makes it difficult for the model to assess the informativeness of unlabeled samples and hinders the selection of high-value samples. To alleviate these issues, this paper proposes a deep active learning image classification algorithm based on class-wise self-knowledge distillation. The algorithm reduces overfitting and class-wise variations by matching, or distilling, the predictive distributions of different samples sharing the same label during training, thereby enabling the active learning algorithm to evaluate the informativeness of unlabeled data more accurately and improving the performance of the classification model. Additionally, an efficient Shuffle Attention mechanism is introduced to improve the sample selection strategy by combining spatial and channel feature information of the images. The proposed algorithm is compared with five active learning baselines on the CIFAR10, CIFAR100, SVHN, and FashionMNIST datasets. Experimental results demonstrate that the proposed algorithm achieves superior classification performance.
KW - Active Learning
KW - Image Classification
KW - Self-Knowledge Distillation
KW - Shuffle Attention
UR - http://www.scopus.com/inward/record.url?scp=85197362965&partnerID=8YFLogxK
U2 - 10.1145/3641584.3641615
DO - 10.1145/3641584.3641615
M3 - Conference Proceeding
AN - SCOPUS:85197362965
T3 - ACM International Conference Proceeding Series
SP - 205
EP - 211
BT - AIPR 2023 - 6th International Conference on Artificial Intelligence and Pattern Recognition
PB - Association for Computing Machinery
T2 - 6th International Conference on Artificial Intelligence and Pattern Recognition, AIPR 2023
Y2 - 22 September 2023 through 24 September 2023
ER -
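
Editorial note: the abstract describes matching the predictive distributions of different samples that share the same label. The lines below are a minimal, hypothetical PyTorch sketch of such a class-wise self-knowledge distillation loss, assuming the common formulation in which each sample's prediction is matched, via KL divergence, to the detached prediction of a same-label partner sample. The function names, temperature, and weighting factor are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def class_wise_self_distillation_loss(logits, paired_logits, temperature=4.0):
    # KL divergence between a sample's softened prediction and the detached,
    # softened prediction of another sample sharing the same class label.
    student = F.log_softmax(logits / temperature, dim=1)
    teacher = F.softmax(paired_logits.detach() / temperature, dim=1)
    return F.kl_div(student, teacher, reduction="batchmean") * (temperature ** 2)

def training_loss(model, images, paired_images, labels, lam=1.0):
    # Cross-entropy on the labeled batch plus the class-wise self-distillation
    # term computed against same-label partner samples (lam is a hypothetical weight).
    logits = model(images)
    with torch.no_grad():
        paired_logits = model(paired_images)  # paired_images carry the same labels
    ce = F.cross_entropy(logits, labels)
    kd = class_wise_self_distillation_loss(logits, paired_logits)
    return ce + lam * kd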