A novel momentum prototypical neural network to cross-domain fault diagnosis for rotating machinery subject to cold-start

Xiaohan Chen, Rui Yang*, Yihao Xue, Chao Yang, Baoye Song, Maiying Zhong

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

Cross-domain rotating machinery fault diagnosis has achieved great success recently with the development of deep transfer learning. However, conventional deep transfer learning methods suffer a severe decline in prediction accuracy when fault samples are limited. Moreover, they require additional parameter tuning on the target tasks rather than supporting cold-start, hampering their implementation in practical fault diagnosis applications. In this paper, a novel method, named momentum prototypical neural network (MoProNet), is proposed for cross-domain few-shot rotating machinery fault diagnosis. MoProNet progressively updates the support encoder to address the prototype oscillation problem, enabling a model trained on limited source domain samples to predict target domain faults with cold-start. The performance of the proposed MoProNet is tested on a bearing dataset and a hardware-in-the-loop high-speed train simulation platform, covering over forty cross-domain few-shot fault diagnosis tasks. The experimental results demonstrate that MoProNet achieves satisfactory results and outperforms comparable methods in the same cross-domain few-shot scenarios with a simple AlexNet backbone.
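The abstract indicates that MoProNet pairs a prototypical classifier with a momentum-updated support encoder, so that class prototypes drift smoothly across training episodes instead of oscillating. Since the paper's code is not reproduced here, the following is a minimal PyTorch sketch of that general idea, assuming a MoCo-style exponential-moving-average update of the support encoder; the function names, momentum value, and episode structure are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def momentum_update(query_enc, support_enc, m=0.999):
    """EMA update: support params <- m * support params + (1 - m) * query params."""
    for q_param, s_param in zip(query_enc.parameters(), support_enc.parameters()):
        s_param.data.mul_(m).add_(q_param.data, alpha=1.0 - m)

def class_prototypes(support_emb, support_labels, num_classes):
    """Prototype of each class = mean of its support embeddings."""
    return torch.stack([support_emb[support_labels == c].mean(dim=0)
                        for c in range(num_classes)])

def prototypical_logits(query_emb, protos):
    """Negative squared Euclidean distance to each prototype serves as a logit."""
    return -torch.cdist(query_emb, protos).pow(2)

def episode_loss(query_enc, support_enc,
                 support_x, support_y, query_x, query_y, num_classes):
    """One N-way K-shot episode: support branch carries no gradient."""
    with torch.no_grad():
        support_emb = support_enc(support_x)
    protos = class_prototypes(support_emb, support_y, num_classes)
    logits = prototypical_logits(query_enc(query_x), protos)
    return F.cross_entropy(logits, query_y)
```

Under this reading, only the query encoder is trained by backpropagation, while `momentum_update` is called after each optimizer step; at inference, cold-start would amount to computing prototypes once from the few labeled target-domain support samples and classifying queries by nearest prototype, with no further gradient updates on the target task.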

Original language: English
Article number: 126656
Journal: Neurocomputing
Volume: 555
Publication status: Published - 28 Oct 2023

Keywords

  • Cold-start
  • Cross-domain
  • Fault diagnosis
  • Few-shot
  • Transfer learning
