Testing the minimal bounded space method on vision-based drone navigation / Yap Seng Kuang

Yap, Seng Kuang (2021) Testing the minimal bounded space method on vision-based drone navigation / Yap Seng Kuang. Masters thesis, Universiti Malaya.



      The object-based approach is the most common strategy for developing robot navigation. It focuses on the segmentation, detection, annotation, and recognition of objects or markers in the scene. For drones, this approach typically draws on sensors such as laser, vision (mono and stereo), ultrasonic, Kinect, and others, often combining them in hybrid solutions to elicit the required information. More recently, the availability of deep learning algorithms has further encouraged the object-based approach for drone navigation. A critical gap in the object-based approach is its massive computational demand; for a drone, especially a low-cost one, this restricts object-based approaches to simulation-only work. The other, less common navigation strategy for robots is the space-based approach. In the space-based approach there is no object learning, and therefore no advanced processing for object recognition or labeling; instead, the focus is on computing the openings in the space surrounding the robot. Recent works have experimented with the space-based approach for robot navigation (Azizul, 2013; Azizul & Khanil, 2017), using a method called the Minimum Bounded Space (MBS). The method takes its name from bounding the spatial openings immediately around the robot. In the earlier work, Azizul (2013) tested the MBS on a mobile robot equipped with a laser sensor. No imaging was involved, but the laser sensor does record depth information, and the spatial openings were derived by analyzing occlusion information in the environment, available from that depth data. The laser-equipped robot was shown to navigate autonomously by moving from one spatial opening to another in an indoor environment. In the later work, Azizul & Khanil (2017) experimented with the MBS on a mobile robot equipped with a camera.
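      The idea of deriving spatial openings from occlusion in a depth scan can be sketched as follows. This is an illustrative example only, not the exact algorithm of Azizul (2013): the scan layout (a flat list of range readings) and the jump threshold are assumptions made for demonstration. Candidate opening boundaries appear wherever consecutive range readings jump sharply, which is where one surface occludes another.

```python
# Illustrative sketch (not the cited work's exact method): finding candidate
# spatial openings in a laser scan by detecting depth discontinuities.
# `ranges` is assumed to be a list of range readings swept across one scan.

def find_openings(ranges, jump_threshold=1.0):
    """Return index pairs (i, i+1) where adjacent readings differ by more
    than jump_threshold, marking occlusion edges / opening boundaries."""
    openings = []
    for i in range(len(ranges) - 1):
        if abs(ranges[i + 1] - ranges[i]) > jump_threshold:
            openings.append((i, i + 1))
    return openings

# A wall on the left, a doorway (large ranges), then a wall on the right:
scan = [2.0, 2.0, 2.1, 6.0, 6.1, 6.0, 2.2, 2.1]
print(find_openings(scan))  # → [(2, 3), (5, 6)]
```

The two boundary pairs bracket the doorway; a navigation step would then steer the robot toward the midpoint of such an opening.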
Imaging was involved, but the image processing differs from that of object-based works: floor segmentation and analysis form the basis for finding the spatial openings immediately around the robot, so the openings in the indoor environment are elicited without depth information. The results from these prior works are encouraging, as they require no complex processing. Furthermore, they show that MBS can be implemented in real-time robot experiments and, interestingly, demonstrate the versatility of the MBS method for autonomous robot navigation with or without depth information. However, the MBS method had not been tested on a flying robot or in an outdoor environment. In this work, I show how the MBS method is implemented as the navigation strategy for a low-cost drone. The drone used in this work is the Parrot Bebop Drone, which is equipped with an on-board camera. To complete this task, I have developed a new computer vision framework to elicit openings for the MBS. The testing done shows that the MBS is useful for low-cost drones flying in both indoor and outdoor environments.
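      The depth-free, floor-segmentation route to openings can likewise be sketched. This is a toy illustration under stated assumptions, not the thesis's vision framework: the "image" is a binary grid where 1 marks a pixel already segmented as floor, and an opening is taken to be a column span where the floor region extends unusually far up the image (i.e., away from the robot).

```python
# Illustrative sketch (an assumption, not the thesis's framework): reading
# openings from a segmented floor mask without any depth information.
# grid[r][c] == 1 means pixel (row r, col c) was segmented as floor;
# row 0 is the top of the image (farthest from the robot).

def floor_frontier(grid):
    """For each column, return the topmost row index that is floor.
    Smaller values mean the floor extends farther ahead."""
    rows, cols = len(grid), len(grid[0])
    frontier = []
    for c in range(cols):
        top = rows  # default: no floor visible in this column
        for r in range(rows):
            if grid[r][c] == 1:
                top = r
                break
        frontier.append(top)
    return frontier

def widest_opening(frontier, depth_threshold=2):
    """Return the (start_col, end_col) span where the floor frontier
    reaches at least as far as depth_threshold, i.e. a candidate opening."""
    best, cur_start = None, None
    for c in range(len(frontier) + 1):  # extra step closes a trailing span
        deep = c < len(frontier) and frontier[c] <= depth_threshold
        if deep and cur_start is None:
            cur_start = c
        elif not deep and cur_start is not None:
            span = (cur_start, c - 1)
            if best is None or span[1] - span[0] > best[1] - best[0]:
                best = span
            cur_start = None
    return best

# Floor fills the bottom rows everywhere, but runs all the way up
# in columns 2-3 (a corridor / doorway ahead):
grid = [
    [0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1],
]
print(floor_frontier(grid))          # → [3, 3, 0, 0, 3, 3]
print(widest_opening(floor_frontier(grid)))  # → (2, 3)
```

A drone could then yaw toward the center of the returned column span; no range or stereo data is needed, mirroring the depth-free character of the camera-based MBS experiments described above.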

      Item Type: Thesis (Masters)
      Additional Information: Dissertation (M.A.) – Faculty of Computer Science & Information Technology, Universiti Malaya, 2021.
      Uncontrolled Keywords: Cognitive robotics; Robot navigation; Spatial representation; Micro aerial vehicle
      Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
      Divisions: Faculty of Computer Science & Information Technology
      Depositing User: Mr Mohd Safri Tahir
      Date Deposited: 15 Feb 2023 06:40
      Last Modified: 15 Feb 2023 06:40
      URI: http://studentsrepo.um.edu.my/id/eprint/14137
