Position and Orientation Aware One-Shot Learning for Medical Action Recognition from Signal Data

Leiyu Xie*, Yuxing Yang, Zeyu Fu, Syed Mohsen Naqvi

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this work, we propose a position and orientation-aware one-shot learning framework for medical action recognition from signal data. The proposed framework comprises two stages, and each stage includes signal-level image generation (SIG), cross-attention (CsA), and dynamic time warping (DTW) modules, together with information fusion between the proposed privacy-preserved position and orientation features. The proposed SIG method transforms the raw skeleton data into privacy-preserved features for training. The CsA module is developed to guide the network in reducing medical action recognition bias and focusing more on the important human body parts for each specific action, thereby addressing issues caused by similar medical actions. Moreover, the DTW module is employed to minimize temporal mismatching between instances and further improve model performance. Furthermore, the proposed privacy-preserved orientation-level features are utilized to assist the position-level features in both stages to enhance medical action recognition performance. Extensive experimental results on the widely used NTU RGB+D 60, NTU RGB+D 120, and PKU-MMD datasets demonstrate the effectiveness of the proposed method, which outperforms other state-of-the-art methods under general dataset partitioning by 2.7%, 6.2%, and 4.1%, respectively.
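
The abstract names two generic building blocks, cross-attention and dynamic time warping, that lend themselves to a short illustration. The sketch below is not the authors' implementation; all names and sizes (CrossAttentionBlock, embed_dim=64, the number of body-part tokens, etc.) are illustrative assumptions. It only shows how frame-level query features could attend to body-part features and how a DTW cost could then compare the fused sequence with a one-shot support sequence.

```python
# Minimal sketch (not the authors' code): generic cross-attention over body-part
# features followed by DTW-based comparison with a one-shot support sequence.
# All dimensions and names are illustrative assumptions.

import numpy as np
import torch
import torch.nn as nn


class CrossAttentionBlock(nn.Module):
    """Cross-attention: frame-level query features attend to body-part key/value features."""

    def __init__(self, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, query_feats: torch.Tensor, part_feats: torch.Tensor) -> torch.Tensor:
        # query_feats: (B, T, D) frame-level features
        # part_feats:  (B, P, D) body-part features used as keys and values
        attended, _ = self.attn(query_feats, part_feats, part_feats)
        return self.norm(query_feats + attended)  # residual connection + layer norm


def dtw_distance(x: np.ndarray, y: np.ndarray) -> float:
    """Classic O(T1*T2) dynamic time warping cost between two feature sequences.

    x: (T1, D), y: (T2, D). The accumulated alignment cost can serve as a
    dissimilarity for nearest-neighbour one-shot matching.
    """
    t1, t2 = len(x), len(y)
    cost = np.full((t1 + 1, t2 + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, t1 + 1):
        for j in range(1, t2 + 1):
            d = np.linalg.norm(x[i - 1] - y[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[t1, t2])


if __name__ == "__main__":
    # Toy usage: attend to body-part tokens, then compare with a one-shot exemplar.
    blk = CrossAttentionBlock()
    q = torch.randn(1, 30, 64)        # 30 frames of query features (illustrative)
    p = torch.randn(1, 25, 64)        # 25 body-part tokens (illustrative)
    fused = blk(q, p).squeeze(0).detach().numpy()
    support = np.random.randn(40, 64)  # one-shot support sequence
    print("DTW cost:", dtw_distance(fused, support))
```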

Original language: English
Journal: IEEE Transactions on Multimedia
DOIs
Publication status: Accepted/In press - 2024
Externally published: Yes

Keywords

  • attention mechanism
  • feature fusion
  • healthcare
  • medical action recognition
  • one-shot learning
