Human face detection from colour images based on multi-skin models, rule-based geometrical knowledge, and artificial neural network / Sinan A. Naji

Naji, Sinan A. (2013) Human face detection from colour images based on multi-skin models, rule-based geometrical knowledge, and artificial neural network / Sinan A. Naji. PhD thesis, University of Malaya.



    Automatic human face detection is a critical first step in a wide range of applications, such as face recognition, face tracking, content-based indexing and retrieval, and communications and teleconferencing. The first task of such systems is to locate the face (or faces) within the image. This research produces an efficient state-of-the-art system for detecting frontal faces in colour images regardless of scale, position, illumination, number of faces, and background complexity. The general architecture of the proposed system consists of three main stages: skin detection, face-centre localization, and a neural network-based face detector. In the first stage, image segmentation techniques locate human skin-colour regions in the input image. First, the source image is converted to the HSV colour space. Then, multi-skin colour clustering models are used to detect skin regions. A total of 24,328,670 training pixels, collected manually from true human skin regions in four public databases, are used to build the skin models. The classification boundaries are transformed into a three-dimensional look-up table to speed up the system. An automatic illumination-correction step adjusts skin colour to improve the general face appearance. In the second stage, rule-based geometrical knowledge is employed to verify the presence of a face by locating the basic facial features; the goal of this step is to remove false alarms caused by objects whose colour is similar to skin. First, the facial features are extracted from the skin maps. Then, rule-based geometrical knowledge describing the human face is applied to estimate the location of the "face centre". In the last stage, a neural network-based face detector decides whether a given sub-image window contains a face or not.
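The skin-detection stage described above (HSV conversion followed by a single three-dimensional table look-up per pixel) can be sketched roughly as follows. This is a minimal illustration only, not the thesis's implementation: the 32-bin table resolution, the pure-Python pixel loop, and the toy training set are assumptions, whereas the thesis builds its table from 24,328,670 manually labelled pixels drawn from four public databases.

```python
import colorsys

import numpy as np

BINS = 32  # quantisation per HSV channel; the thesis's actual table resolution is not stated


def build_skin_lut(skin_pixels_rgb):
    """Build a 3-D boolean look-up table from example skin pixels.

    Each labelled pixel marks its quantised (H, S, V) cell as "skin", so that
    classification later costs a single table access per image pixel.
    """
    lut = np.zeros((BINS, BINS, BINS), dtype=bool)
    for r, g, b in skin_pixels_rgb:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        lut[int(h * (BINS - 1)), int(s * (BINS - 1)), int(v * (BINS - 1))] = True
    return lut


def skin_map(image_rgb, lut):
    """Return a boolean skin mask: convert each pixel to HSV, then look it up."""
    height, width, _ = image_rgb.shape
    mask = np.zeros((height, width), dtype=bool)
    for y in range(height):
        for x in range(width):
            r, g, b = image_rgb[y, x]
            h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            mask[y, x] = lut[int(h * (BINS - 1)), int(s * (BINS - 1)), int(v * (BINS - 1))]
    return mask
```

The point of the look-up table is that all colour-model mathematics is paid once at training time; at detection time every pixel is classified by one array access, which is what makes dense skin-map computation fast.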
The neural network-based face detector is applied only to the regions of the image marked as candidate face-centres. The classification phase consists of four steps: the cropper, the histogram equalizer, the texture analyzer, and the ANN-based classifier. The cropper extracts a pyramid of sub-images around each candidate from the source image; the histogram equalizer improves contrast; and the texture analyzer computes texture descriptors. Training of the neural network is done offline and is designed to be general, with minimum customization. A total of 40,000 face and non-face images were collected to train the ANN-based classifier. Integrating these different methodologies in one system, where one method can compensate for the weaknesses of another, yields reasonably accurate results. The system has been trained, tested, and evaluated on five public databases containing faces of different sizes, ethnicities, and lighting conditions against cluttered backgrounds. A comparison with state-of-the-art methods indicates that the system achieves viable detection performance.
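The cropper and histogram-equalizer steps of the classification phase can be illustrated with a short sketch. Everything specific here is a hypothetical stand-in: the 20-pixel base window, the three scale factors, and the `classify` callable (which in the thesis would be the trained ANN operating on texture descriptors) are assumptions for illustration only.

```python
import numpy as np


def histogram_equalise(window):
    """Stretch the grey-level histogram of an 8-bit window to full contrast."""
    hist = np.bincount(window.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    if cdf[-1] == cdf_min:  # constant window: nothing to equalise
        return window.copy()
    table = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255), 0, 255)
    return table.astype(np.uint8)[window]


def detect_at_centre(grey, centre, classify, win=20, scales=(1.0, 1.5, 2.0)):
    """Crop a small pyramid of windows around one candidate face-centre and
    hand each equalised window to the classifier (any boolean callable here)."""
    cy, cx = centre
    detections = []
    for s in scales:
        half = int(win * s) // 2
        y0, x0, y1, x1 = cy - half, cx - half, cy + half, cx + half
        if y0 < 0 or x0 < 0 or y1 > grey.shape[0] or x1 > grey.shape[1]:
            continue  # window falls outside the image
        window = histogram_equalise(grey[y0:y1, x0:x1])
        if classify(window):
            detections.append((s, (y0, x0, y1, x1)))
    return detections
```

Restricting the pyramid to candidate face-centres, rather than sliding a window over the whole image, is what lets the earlier skin-detection and geometric stages cut the work (and the false alarms) handed to the neural classifier.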

    Item Type: Thesis (PhD)
    Additional Information: Thesis (Ph.D.) -- Jabatan Kepintaran Buatan, Fakulti Sains Komputer dan Teknologi Maklumat (Department of Artificial Intelligence, Faculty of Computer Science and Information Technology), Universiti Malaya, 2013
    Uncontrolled Keywords: Human face recognition (Computer science); Face perception--Data processing; Image processing--Digital techniques; Neural networks (Computer science)
    Subjects: Q Science > QA Mathematics > QA76 Computer software
    T Technology > T Technology (General)
    Divisions: Faculty of Computer Science & Information Technology > Dept of Artificial Intelligence
    Depositing User: Miss Dashini Harikrishnan
    Date Deposited: 19 Jun 2015 13:20
    Last Modified: 19 Jun 2015 13:20
