TY - GEN
T1 - Virtual input using skin unit color model for robotic platform control
AU - Nordin, Nadira
AU - Arshad, M. R.
AU - Soori, U.
AU - Yin, N. M.
PY - 2009
Y1 - 2009
N2 - The aim of this research effort is to build a robust image-based virtual input for robotic platform control. Compared to the motion of the head, eye gaze, face or even the whole body, tracking of hand movement has become increasingly popular, efficient and more natural in human-computer interaction. However, available systems are invasive and require the user to wear gloves or markers. In this project, we propose a markerless hand tracking system as a virtual input device. We track the hand movement using skin color as the detection cue because color allows fast processing and is highly robust to geometric variations of the hand pattern [1]. Skin segmentation is performed with a parametric skin model using a single Gaussian method, where the mean and covariance of the chrominance components are calculated and the Mahalanobis distance is thresholded to classify skin and non-skin pixels. Then, using a blob analysis technique, the centroid is extracted and used as the position of the hand. The robotic platform can be controlled by a set of six instructions, namely stop, start, forward, backward, left and right, based on the movement of the hand.
AB - The aim of this research effort is to build a robust image-based virtual input for robotic platform control. Compared to the motion of the head, eye gaze, face or even the whole body, tracking of hand movement has become increasingly popular, efficient and more natural in human-computer interaction. However, available systems are invasive and require the user to wear gloves or markers. In this project, we propose a markerless hand tracking system as a virtual input device. We track the hand movement using skin color as the detection cue because color allows fast processing and is highly robust to geometric variations of the hand pattern [1]. Skin segmentation is performed with a parametric skin model using a single Gaussian method, where the mean and covariance of the chrominance components are calculated and the Mahalanobis distance is thresholded to classify skin and non-skin pixels. Then, using a blob analysis technique, the centroid is extracted and used as the position of the hand. The robotic platform can be controlled by a set of six instructions, namely stop, start, forward, backward, left and right, based on the movement of the hand.
KW - Gesture recognition
KW - HCI
KW - Platform control
KW - Skin color
KW - Virtual input
UR - http://www.scopus.com/inward/record.url?scp=77954485022&partnerID=8YFLogxK
U2 - 10.1109/ICSIPA.2009.5478666
DO - 10.1109/ICSIPA.2009.5478666
M3 - Conference Proceeding
AN - SCOPUS:77954485022
SN - 9781424455614
T3 - ICSIPA09 - 2009 IEEE International Conference on Signal and Image Processing Applications, Conference Proceedings
SP - 305
EP - 311
BT - ICSIPA09 - 2009 IEEE International Conference on Signal and Image Processing Applications, Conference Proceedings
T2 - 2009 IEEE International Conference on Signal and Image Processing Applications, ICSIPA09
Y2 - 18 November 2009 through 19 November 2009
ER -