#45: Real-time robust detection and extraction of hand gestures for HCI


Y. Zhu, K. Palaniappan, H. Jiang, Y. Zhao, X. Zhuang, and G. Xu

IASTED 5th Int. Conf. Computer Graphics and Imaging (CGIM'02), pp. 68–73, 2002

Keywords: motion, visual events, tracking, data mining


Abstract

Aiming at the use of hand gestures for human–computer interaction, this paper presents a real-time approach to the spotting, representation, and recognition of hand gestures from a video stream. The approach exploits multiple cues, including skin color, hand motion, and shape. Skin color analysis and coarse image motion detection are combined to perform reliable hand gesture spotting. At a higher level, a compact spatiotemporal representation is proposed for modeling appearance changes in image sequences containing hand gestures. The representation is extracted by combining robust parameterized image motion regression with shape features of the segmented hand. For efficient recognition of gestures made at varying rates, a linear resampling technique is developed that eliminates temporal variation (time normalization) while maintaining the essential information of the original gesture representations. The gesture is then classified against a training set of gestures. In experiments with a library of 12 gestures, the recognition rate was over 90%. Through the development of a prototype gesture-controlled panoramic map browser, we demonstrate that a vocabulary of predefined hand gestures can be used to interact successfully with applications running on an off-the-shelf personal computer equipped with a home video camera.
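The time normalization step described in the abstract can be sketched as a simple linear resampling of a variable-length feature sequence onto a fixed number of frames. The sketch below is an illustration under assumed details, not the paper's implementation: the target length (16 frames), the feature dimensionality, and the function name `time_normalize` are all hypothetical choices for demonstration.

```python
import numpy as np

def time_normalize(seq, n_frames=16):
    """Linearly resample a gesture feature sequence of shape (T, D)
    to a fixed number of frames, removing variation in gesture speed.
    n_frames=16 is an illustrative choice, not taken from the paper."""
    seq = np.asarray(seq, dtype=float)
    t_src = np.linspace(0.0, 1.0, num=len(seq))   # original time axis
    t_dst = np.linspace(0.0, 1.0, num=n_frames)   # normalized time axis
    # Interpolate each feature dimension independently.
    return np.column_stack([np.interp(t_dst, t_src, seq[:, d])
                            for d in range(seq.shape[1])])

# A slow (20-frame) and a fast (9-frame) rendition of a gesture map to
# representations of identical shape, suitable for template matching.
slow = time_normalize(np.random.rand(20, 5))
fast = time_normalize(np.random.rand(9, 5))
```

After this normalization, sequences of different durations become directly comparable, so a classifier can match them against the fixed-length templates in the training set.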