FAD: Feature augmented distillation for anomaly detection and localization

Qiyin Zhong, Xianglin Qiu, Xinqiao Zhao, Xiaowei Huang, Gang Liu*, Jimin Xiao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Anomaly Detection (AD) is vital for quality control in industrial manufacturing, but obtaining sufficient anomalous data for supervised learning is challenging. Unsupervised AD methods, which use only normal data, are a practical alternative. However, these methods, including memory-bank-based and knowledge-distillation-based techniques, often misclassify rare normal textures as anomalies, a problem we term tailed texture misdirection and are the first to identify. To address this, we propose a Feature Augmented Distillation (FAD) framework that performs feature augmentation for normal tailed textures. Our approach selects under-fitted layers and generates Gaussian-Perturbed High Heterogeneity (GPHH) features on the selected layers to mimic normal tailed textures. We then re-learn from the GPHH features, which improves the model's adaptability to normal tailed textures and reduces tailed texture misdirection. Experimental results on the MVTec AD and ViSA benchmarks show that FAD achieves competitive performance compared to state-of-the-art approaches, particularly for detecting long-tailed normal textures. The code will be released.
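The abstract describes generating Gaussian-perturbed features on under-fitted layers to mimic rare normal textures. The sketch below illustrates one plausible reading of that idea: Gaussian noise is added to an intermediate feature map, weighted by per-channel variance so that more heterogeneous channels are perturbed more strongly. The function name, the noise scale `sigma`, and the variance-based weighting are illustrative assumptions, not the paper's exact GPHH formulation.

```python
import numpy as np

def gaussian_perturb_features(feats, sigma=0.1, seed=0):
    """Illustrative sketch of Gaussian feature perturbation (assumed form,
    not the paper's exact GPHH method): add Gaussian noise to a (C, H, W)
    feature map, scaled per channel by its spatial variance so that
    high-heterogeneity channels receive stronger perturbation."""
    rng = np.random.default_rng(seed)
    # Per-channel spatial variance as a simple heterogeneity proxy.
    het = feats.var(axis=(1, 2), keepdims=True)   # shape (C, 1, 1)
    het = het / (het.mean() + 1e-8)               # normalize the weights
    noise = rng.normal(0.0, sigma, size=feats.shape)
    return feats + noise * het

# Toy example: an 8-channel, 4x4 feature map from some under-fitted layer.
feats = np.random.default_rng(1).normal(size=(8, 4, 4))
aug = gaussian_perturb_features(feats)
print(aug.shape)  # (8, 4, 4)
```

In a distillation setting, such augmented features would then be fed back through the student for the re-learning step the abstract mentions; the details of that loss are not given here.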

Original language: English
Article number: 128249
Journal: Expert Systems with Applications
Volume: 288
DOIs
Publication status: Published - 1 Sept 2025

Keywords

  • Anomaly detection
  • Feature augmentation
  • Unsupervised learning
