Towards a Real-Time American Sign Language Typing Interface Using Static and Dynamic Hand Gestures

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

This paper presents a vision-based sign language recognition system designed to support real-time communication between deaf-mute individuals and digital interfaces or robotic agents. The system focuses on three primary modules: static and dynamic single-hand gesture recognition, dual-hand static gesture recognition, and an interactive user interface. Static gestures, covering 24 ASL alphabet letters (excluding J and Z) and two functional commands (YES and NO), are classified using a fine-tuned VGG16 convolutional neural network, which achieved a validation accuracy of 92.17%. For dynamic gestures, specifically J and Z, a Long Short-Term Memory (LSTM) model processes landmark trajectories extracted via MediaPipe, achieving a validation accuracy of 91.67%. The dual-hand recognition component, trained separately using the same CNN architecture, accurately detects more complex gestures such as the word "book." All models are integrated into a multithreaded Python-based user interface that supports live video input, on-screen gesture tracking, dynamic gesture triggering, and buffered text output. Testing results show high classification accuracy across all gesture categories, with real-time performance and consistent recognition even under variable lighting and hand positions. Overall, the system demonstrates a scalable and accessible approach to gesture-based interaction, providing foundational work for future integration with robotic systems or service-oriented applications for deaf and mute users.
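As a concrete illustration of the dynamic-gesture pipeline described above (MediaPipe landmark trajectories fed to an LSTM classifier), the sketch below shows how per-frame hand landmarks might be buffered into fixed-length windows ready for sequence classification. The class name, window length, and feature layout are illustrative assumptions, not the paper's actual implementation.

```python
from collections import deque
import numpy as np

SEQ_LEN = 30       # assumed number of frames per dynamic gesture (e.g. J or Z)
NUM_FEATURES = 63  # 21 MediaPipe hand landmarks x (x, y, z) coordinates

class TrajectoryBuffer:
    """Hypothetical buffer collecting per-frame landmark vectors
    into LSTM-ready windows of shape (1, SEQ_LEN, NUM_FEATURES)."""

    def __init__(self, seq_len=SEQ_LEN):
        self.frames = deque(maxlen=seq_len)

    def push(self, landmarks):
        # Flatten one frame's 21 x 3 landmark array into a 63-vector.
        vec = np.asarray(landmarks, dtype=np.float32).reshape(-1)
        assert vec.shape == (NUM_FEATURES,)
        self.frames.append(vec)

    def ready(self):
        # True once enough frames have accumulated to trigger inference.
        return len(self.frames) == self.frames.maxlen

    def window(self):
        # Stack frames into a single batch an LSTM model could consume.
        return np.stack(self.frames)[np.newaxis, ...]

buf = TrajectoryBuffer()
for _ in range(SEQ_LEN):
    buf.push(np.random.rand(21, 3))  # stand-in for real MediaPipe output
print(buf.ready(), buf.window().shape)  # True (1, 30, 63)
```

In a live system, the `ready()` check would gate the "dynamic gesture triggering" the abstract mentions, so the LSTM is only invoked once a full trajectory is available.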
Original language: English
Title of host publication: Proceedings of 2025 8th International Conference on Big Data and Artificial Intelligence
Place of Publication: China
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 77-82
Number of pages: 6
ISBN (Print): 979-8-3503-9252-4, 979-8-3503-9251-7
DOIs
Publication status: Published - 13 Jan 2026
