TFPred: Learning discriminative representations from unlabeled data for few-label rotating machinery fault diagnosis

Xiaohan Chen, Rui Yang*, Yihao Xue, Baoye Song, Zidong Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Recent advances in intelligent rotating machinery fault diagnosis have been enabled by the availability of massive labeled training data. However, in practical industrial applications, annotating a large amount of data is often challenging and costly. To address the few-label fault diagnosis problem, a time–frequency prediction (TFPred) self-supervised learning framework is proposed to extract latent fault representations from unlabeled fault data. Specifically, the TFPred framework consists of a time encoder and a frequency encoder, where the frequency encoder is trained to predict the low-dimensional representations that the time encoder produces from randomly augmented time-domain signals. Subsequently, the pre-trained network is attached to a classification head and fine-tuned with limited labeled data. Finally, the proposed framework is evaluated on a run-to-failure bearing dataset and a hardware-in-the-loop high-speed train simulation platform. The experiments demonstrate that the self-supervised TFPred framework achieves competitive performance with only 1% and 5% labeled data. Code is available at https://github.com/Xiaohan-Chen/TFPred.
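The abstract describes the pre-training scheme only at a high level. Below is a minimal PyTorch-style sketch of what one time–frequency prediction step could look like; the encoder architecture, the jitter augmentation, the FFT-magnitude frequency view, and the cosine-similarity prediction loss are all illustrative assumptions for exposition, not the authors' exact implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Small 1-D CNN encoder (hypothetical stand-in for the paper's encoders)."""
    def __init__(self, in_channels=1, emb_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, x):
        return self.net(x)


def pretrain_step(time_encoder, freq_encoder, x, optimizer):
    """One self-supervised step: the frequency encoder predicts the
    time encoder's representation of a randomly augmented signal.
    x: raw vibration segments of shape (batch, 1, length)."""
    x_aug = x + 0.01 * torch.randn_like(x)          # placeholder augmentation (jitter)
    x_freq = torch.abs(torch.fft.rfft(x, dim=-1))   # frequency-domain view (magnitude spectrum)

    z_time = time_encoder(x_aug)                    # target: time-domain representation
    z_freq = freq_encoder(x_freq)                   # prediction from the frequency domain

    # Cosine-similarity prediction loss (one plausible choice, not confirmed by the paper)
    loss = 1 - F.cosine_similarity(z_freq, z_time.detach(), dim=-1).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

After pre-training, the time encoder would be attached to a small classification head and fine-tuned on the limited labeled set, as the abstract describes.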

Original language: English
Article number: 105900
Journal: Control Engineering Practice
Volume: 146
DOIs
Publication status: Published - May 2024

Keywords

  • Contrastive learning
  • Fault diagnosis
  • Few-labeled data
  • Self-supervised learning
  • Weakly label

