
International Journal of Advanced Engineering, Management and Science


A Real-Time Letter Recognition Model for Arabic Sign Language Using Kinect and Leap Motion Controller v2

(Vol-2, Issue-5, May 2016)

Author(s): Miada A. Almasre, Hana Al-Nuaim



Page No: 514-523
DOI:

Keywords:

Hand gesture recognition, Arabic Sign Language, Kinect version 2, Leap Motion Controller, skeleton.

Abstract:

The objective of this research is to develop a supervised machine learning hand-gesture recognition model for Arabic Sign Language (ArSL), using two sensors: Microsoft's Kinect and a Leap Motion Controller. The proposed model relies on supervised learning to predict a hand pose from two depth images, and it defines a classifier algorithm that dynamically transforms gestural interactions, based on the 3D positions and directions of the hand joints, into their corresponding letters, so that live gestures can be compared and letters displayed in real time. This research is motivated by the need to make it easier for the Arabic hearing-impaired to communicate using ArSL, and it is the first step towards building a full communication system for the Arabic hearing-impaired that can improve the interpretation of detected letters using fewer calculations. To evaluate the model, participants were asked to gesture each of the 28 letters of the Arabic alphabet multiple times, creating an ArSL letter data set of gestures built from the depth images retrieved by the two devices. Participants were later asked to gesture letters to validate the classifier algorithm. The results indicated that using both devices was essential for the ArSL model, which correctly detected and recognized 22 of the 28 letters of the Arabic alphabet with 100% accuracy.
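The abstract does not specify the feature layout or the classifier used; as a purely illustrative sketch, the snippet below shows one plausible shape for such a pipeline: a multi-class classifier trained on flattened 3D hand-joint coordinates. The joint count, sample counts, and the choice of an RBF-kernel SVM are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch only: the paper's actual features and classifier
# are not specified in the abstract. Assumes each gesture sample is a
# flat vector of 3D joint positions merged from the two depth sensors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

n_samples, n_joints = 1400, 25      # hypothetical: 28 letters x 50 repetitions
rng = np.random.default_rng(0)
X = rng.normal(size=(n_samples, n_joints * 3))  # placeholder joint features
y = rng.integers(0, 28, size=n_samples)         # labels: the 28 ArSL letters

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Scale features, then fit a multi-class classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# At runtime, a live frame's joint vector would be classified the same way:
# letter_index = clf.predict(live_joint_vector.reshape(1, -1))[0]
```

In a real-time setting, each incoming depth frame would be reduced to the same joint-feature vector before calling predict, which keeps the per-frame classification cost low.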
