Isolated sign language recognition using Convolutional Neural Network hand modelling and Hand Energy Image

Kian Ming Lim*, Alan Wee Chiat Tan, Chin Poo Lee, Shing Chiang Tan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

49 Citations (Scopus)

Abstract

This paper presents an isolated sign language recognition system that comprises two main phases: hand tracking and hand representation. In the hand tracking phase, an annotated hand dataset is used to extract hand patches, which pre-train Convolutional Neural Network (CNN) hand models. Hand tracking is performed by a particle filter that combines hand motion and the CNN pre-trained hand models into a joint likelihood observation model. The predicted hand position corresponds to the location of the particle with the highest joint likelihood. Based on the predicted hand position, a square hand region centered at that position is segmented and serves as the input to the hand representation phase. In the hand representation phase, a compact hand representation is computed by averaging the segmented hand regions; the result is referred to as the “Hand Energy Image (HEI)”. Quantitative and qualitative analyses show that the proposed hand tracking method predicts hand positions closer to the ground truth. Similarly, the proposed HEI hand representation outperforms other methods in isolated sign language recognition.
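The two core computations described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' implementation: the abstract does not specify how the motion and CNN likelihoods are combined, so a simple product is assumed here, and all function names are hypothetical.

```python
import numpy as np

def joint_likelihood(motion_lik, cnn_lik):
    # Assumed combination of the two observation likelihoods
    # (the paper's exact fusion rule is not given in the abstract).
    return motion_lik * cnn_lik

def predict_hand_position(particles, motion_liks, cnn_liks):
    # The predicted hand position is the particle with the
    # highest joint likelihood.
    weights = joint_likelihood(motion_liks, cnn_liks)
    return particles[np.argmax(weights)]

def hand_energy_image(hand_regions):
    # Hand Energy Image (HEI): the per-pixel average of the
    # equally-sized segmented hand regions across all frames.
    frames = np.stack([np.asarray(r, dtype=np.float64) for r in hand_regions])
    return frames.mean(axis=0)

# Toy usage: three candidate particles and three 4x4 "hand patches".
particles = np.array([[10, 12], [14, 11], [9, 15]])
motion = np.array([0.2, 0.9, 0.1])
cnn = np.array([0.5, 0.8, 0.9])
best = predict_hand_position(particles, motion, cnn)  # → [14, 11]

patches = [np.full((4, 4), v) for v in (0.0, 0.5, 1.0)]
hei = hand_energy_image(patches)  # every pixel → 0.5
```

The HEI mirrors the Motion/Gait Energy Image idea: averaging over time compresses a whole gesture sequence into a single template image, which is then fed to a classifier.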

Original language: English
Pages (from-to): 19917-19944
Number of pages: 28
Journal: Multimedia Tools and Applications
Volume: 78
Issue number: 14
DOIs
Publication status: Published - 30 Jul 2019
Externally published: Yes

Keywords

  • Convolutional Neural Network
  • Hand Energy Image
  • Hand gesture recognition
  • Sign language recognition
