Vision and sensor-based signer-independent framework for Arabic sign language recognition

Al-Shamayleh, Ahmad Sami Abd Alkareem (2020) Vision and sensor-based signer-independent framework for Arabic sign language recognition. PhD thesis, Universiti Malaya.

PDF (The Candidate's Agreement) - Restricted to Repository staff only - Download (229KB)
PDF (Thesis PhD) - Download (4MB)


      Hearing and speech impairment is widespread throughout the world. At present, 15 million people in the Arab world have this disability, and about 86% of them live in low- and middle-income countries. Sign language (SL) in the region can be classified into standard Arabic sign language (ArSL) and local Arabic sign languages (LArSL). ArSL is the formal standard and the more widely accepted SL in the Arab world; it also serves as the medium of instruction in schools and universities as well as in television news, shows and programmes. In the absence of usable ArSL recognition (ArSLR) platforms, hearing- and speech-impaired people tend to be isolated and face serious difficulties in communication and interaction. The focus of this thesis is to design and create a new ArSL database and ArSLR frameworks that cover all ArSLR approaches and ArSL sign forms, based on standard criteria and the consensus of SL experts. Essentially, this thesis proposes two frameworks for modelling and developing ArSLR. The first framework is a usable, signer-independent approach for recognising static backhand letter and number signs based on the Vision-Based Recognition (VBR) approach using a smartphone camera. The second framework is a usable hybrid of VBR and Sensor-Based Recognition (SBR) for developing and evaluating signer-independent continuous ArSLR using Microsoft Kinect and smart data gloves. To accomplish the research objective, an extensive systematic literature review was first conducted to create research taxonomies and to identify the research gaps in ArSLR approaches and ArSL sign forms.
      In the first framework, the input signs were first split into open- and closed-hand signs. The framework then applied suitable recognition techniques to each group: for closed-hand signs, the discrete wavelet transform was integrated with 1D signature signals, while open-hand signs were differentiated through the distribution of quantised area levels generated from the Run-Length Matrix. Statistical analysis approaches were employed to compute the feature descriptors, and a multi-classification approach was implemented in the recognition phase. The framework recognises ArSL static backhand finger numbers and letters with an accuracy of 95.63%. The second framework shows how to jointly exploit VBR and SBR for accurate signer-independent ArSLR. First, a dedicated solution for the combined calibration of the two different devices was proposed. Then, a set of novel feature descriptors was computed for both the smart gloves and the Microsoft Kinect. Finally, the proposed feature sets were fed to multi-class support vector machines. Experimental results show that the obtained accuracies on the Arabic alphabet, Arabic numbers, isolated words and continuous sentences datasets of the ArSL database are 99.38%, 99.42%, 99.15% and 99.55%, respectively. In conclusion, the created database and the proposed frameworks improve the performance of ArSLR.
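      As a hedged illustration of the closed-hand pipeline summarised in the abstract (a 1D signature signal decomposed with a discrete wavelet transform, then summarised by statistical feature descriptors), the sketch below implements a single-level Haar DWT in NumPy. The toy signature signal, the choice of Haar wavelet, one decomposition level, and the descriptor set (mean, standard deviation, energy) are illustrative assumptions, not the thesis's exact configuration.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient arrays."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                          # pad odd-length signals
        s = np.append(s, s[-1])
    a = (s[0::2] + s[1::2]) / np.sqrt(2)    # low-pass (approximation)
    d = (s[0::2] - s[1::2]) / np.sqrt(2)    # high-pass (detail)
    return a, d

def feature_descriptors(coeffs):
    """Statistical descriptors (mean, std, energy) over one
    band of wavelet coefficients, as a small feature vector."""
    c = np.asarray(coeffs, dtype=float)
    return np.array([c.mean(), c.std(), np.sum(c ** 2)])

# Hypothetical 1D signature signal: centroid-to-contour distances
# sampled around a closed-hand silhouette (a toy 4-lobed shape).
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
signature = 1.0 + 0.3 * np.cos(4 * theta)

approx, detail = haar_dwt(signature)
features = np.concatenate([feature_descriptors(approx),
                           feature_descriptors(detail)])
print(features)   # 6-dimensional descriptor fed to the classifier
```

      In a pipeline like the one described, such per-sign descriptor vectors would then be passed to a multi-class classifier (the thesis's second framework uses multi-class support vector machines) for sign recognition.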

      Item Type: Thesis (PhD)
      Additional Information: Thesis (PhD) – Faculty of Computer Science & Information Technology, Universiti Malaya, 2020.
      Uncontrolled Keywords: Arabic sign language; Vision-based recognition; Sensor-based recognition; Signer-independent
      Subjects: Q Science > QA Mathematics > QA76 Computer software
      T Technology > T Technology (General)
      Divisions: Faculty of Computer Science & Information Technology
      Depositing User: Mr Mohd Safri Tahir
      Date Deposited: 06 Apr 2023 05:55
      Last Modified: 06 Apr 2023 05:55
