Please use this identifier to cite or link to this item:
http://repository.elizadeuniversity.edu.ng/jspui/handle/20.500.12398/509
Title: Modeling of Human Upper Body for Sign Language Recognition
Authors: Bilal, Sara; Akmeliawati, Rini; Shafie, Amir A.; Salami, Momoh-Jimoh E.
Keywords: Human upper body detection; Scaling; Tracking using CAMSHIFT; Sign Language Recognition
Issue Date: 6-Dec-2011
Publisher: IEEE
Citation: Bilal, S., Akmeliawati, R., Shafie, A. A., & Salami, M. J. E. (2011, December). Modeling of Human Upper Body for Sign Language Recognition. In The 5th International Conference on Automation, Robotics and Applications (pp. 104-108). IEEE.
Abstract: Sign Language Recognition systems must classify not only the hand motion trajectory but also facial features, the Human Upper Body (HUB), and the hand position with respect to other HUB parts. The head, face, forehead, shoulders and chest are crucial parts that carry much of the positioning information of hand gestures used in gesture classification. In this paper, as the main contribution, a fast and robust search algorithm for HUB parts based on head size is introduced for real-time implementation. Scaling of the extracted parts during body orientation changes is attained using partial estimation of the face size. Tracking of the extracted parts in front and side views is achieved using CAMSHIFT [24]. The performance of the system makes it applicable to real-time applications such as Sign Language Recognition (SLR) systems.
URI: http://repository.elizadeuniversity.edu.ng/jspui/handle/20.500.12398/509
Appears in Collections: Research Articles
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Modeling of Human Upper Body for Sign Language Recognition.pdf | Article full-text | 344.92 kB | Adobe PDF |
Items in EUSpace are protected by copyright, with all rights reserved, unless otherwise indicated.