Lee, Sue Han (2018) Deep plant: A deep learning approach for plant classification / Lee Sue Han. PhD thesis, University of Malaya.
PDF (The Candidate's Agreement): Restricted to Repository staff only (1288Kb)
PDF (Thesis PhD): Download (49Mb)
Abstract
Plant classification systems developed by computer vision researchers have helped botanists recognize and identify unknown plant species more rapidly. To date, the majority of computer vision approaches have focused on designing sophisticated algorithms to achieve a robust feature representation of plant data. Researchers typically characterize the morphological leaf features pre-defined by botanists using hand-engineered approaches, seeking procedures or algorithms that maximize the use of leaf databases for plant predictive modelling; this yields leaf features that are liable to change with different leaf data and feature extraction techniques. As a solution, the first part of the thesis proposes a novel framework based on Deep Learning (DL) to resolve the ambiguities of leaf features deemed important for species discrimination. Leaf features are first learned directly from the raw representations of the input data using Convolutional Neural Networks (CNN), and the learned features are then examined using a Deconvolutional Network (DN) approach.

Beyond using a single leaf organ to recognize plant species, numerous studies have employed DL methods to address the multi-organ plant classification problem. These focus on generic features, such as the holistic representation of a plant image, while disregarding organ-specific features. In such cases, irrelevant features may be erroneously captured, especially when they appear discriminative for species recognition. Therefore, the second part of the thesis proposes a new hybrid generic-organ CNN architecture. It goes beyond the regular generic description of a plant, integrating organ-specific features with generic features to explicitly force the designed network to focus on organ regions during species classification.

Modelling the relationship between different plant views (or organs) is important, as images captured from the same plant share overlapping characteristics that are useful for species recognition. Existing CNN-based approaches can capture only similar region-wise patterns within an image, not the structural patterns of a plant composed of a varying number of view images, each covering one or more organs. The third part of the thesis therefore proposes a novel framework for plant structural learning based on Recurrent Neural Networks (RNN), named Plant-StructNet. It takes into account the contextual dependencies between varying plant views, each capturing one or more organs of a plant, and optimizes them for species classification.

In summary, these contributions collectively constitute a more practical and feasible framework for plant identification applications. Empirical studies show that the proposed frameworks outperform state-of-the-art (SOTA) methods on the Flavia (S. G. Wu et al., 2007a) and PlantCLEF 2015 (Joly et al., 2015) datasets. These findings can serve as a reference for the research community working on plant identification and support future work in this area.
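To make the first part's pipeline (CNN feature learning followed by deconvolutional examination) concrete, here is a minimal PyTorch sketch in the Zeiler–Fergus deconvnet style. The two-layer backbone, layer widths, class count, and use of max-unpooling switch indices are illustrative assumptions, not the thesis's actual implementation.

```python
import torch
import torch.nn as nn

# Minimal leaf-feature CNN; sizes and num_classes are hypothetical.
class LeafCNN(nn.Module):
    def __init__(self, num_classes=32):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, 5, padding=2)
        self.conv2 = nn.Conv2d(16, 32, 5, padding=2)
        self.pool = nn.MaxPool2d(2, return_indices=True)  # switches reused below
        self.relu = nn.ReLU()
        self.fc = nn.Linear(32 * 56 * 56, num_classes)

    def forward(self, x):
        x, idx1 = self.pool(self.relu(self.conv1(x)))
        x, idx2 = self.pool(self.relu(self.conv2(x)))
        feats = x                              # layer-2 feature maps to examine
        logits = self.fc(x.flatten(1))
        return logits, feats, (idx1, idx2)

# Deconvolutional pass: unpool with the stored switches, rectify, then apply
# each layer's transposed convolution, projecting activations back toward
# pixel space. To inspect a single filter, zero all other channels of `feats`.
def deconv_visualize(model, feats, indices):
    idx1, idx2 = indices
    unpool = nn.MaxUnpool2d(2)
    x = unpool(feats, idx2)
    x = torch.relu(x)
    x = nn.functional.conv_transpose2d(x, model.conv2.weight, padding=2)
    x = unpool(x, idx1)
    x = torch.relu(x)
    x = nn.functional.conv_transpose2d(x, model.conv1.weight, padding=2)
    return x  # same spatial size as the input image

model = LeafCNN()
img = torch.randn(1, 3, 224, 224)
logits, feats, indices = model(img)
recon = deconv_visualize(model, feats, indices)
print(recon.shape)  # torch.Size([1, 3, 224, 224])
```

The second part's hybrid generic-organ idea can be sketched as a two-branch network whose features are fused before species classification. The branch design, concatenation fusion, auxiliary organ head, and all sizes below are assumptions for illustration; the thesis's exact architecture is in the full text.

```python
import torch
import torch.nn as nn

# Small shared-architecture branch; widths are hypothetical.
def make_branch():
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    )

class HybridGenericOrganCNN(nn.Module):
    def __init__(self, num_species=100, num_organs=7):
        super().__init__()
        self.generic = make_branch()   # holistic view of the whole plant image
        self.organ = make_branch()     # branch steered toward organ-specific patterns
        self.organ_head = nn.Linear(64, num_organs)           # auxiliary organ classifier
        self.species_head = nn.Linear(64 + 64, num_species)   # fused species classifier

    def forward(self, x):
        g = self.generic(x)
        o = self.organ(x)
        organ_logits = self.organ_head(o)                     # supervised with organ labels
        species_logits = self.species_head(torch.cat([g, o], dim=1))
        return species_logits, organ_logits

model = HybridGenericOrganCNN()
x = torch.randn(2, 3, 128, 128)
species_logits, organ_logits = model(x)
print(species_logits.shape, organ_logits.shape)  # (2, 100) (2, 7)
```

In a sketch like this, training with a joint loss (species cross-entropy plus the auxiliary organ cross-entropy) is one plausible way to force the organ branch to attend to organ regions, as the abstract describes.

Finally, the third part's Plant-StructNet concept, treating the variable-length set of view images from one plant observation as a sequence for an RNN, might look like the following. The shared CNN encoder, the choice of a GRU, and reading the final hidden state are assumed details, not the thesis's specification.

```python
import torch
import torch.nn as nn

class PlantStructNet(nn.Module):
    def __init__(self, feat_dim=64, hidden=128, num_species=100):
        super().__init__()
        self.encoder = nn.Sequential(              # shared per-view CNN encoder
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, feat_dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_species)

    def forward(self, views):
        # views: (num_views, 3, H, W) — all images of one plant observation
        feats = self.encoder(views).unsqueeze(0)   # (1, num_views, feat_dim)
        _, h = self.rnn(feats)                     # contextual dependencies across views
        return self.head(h[-1])                    # one species prediction per plant

model = PlantStructNet()
obs = torch.randn(4, 3, 128, 128)  # e.g. leaf, flower, fruit, and whole-plant views
print(model(obs).shape)            # torch.Size([1, 100])
```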
Item Type: | Thesis (PhD)
---|---
Additional Information: | Thesis (PhD) - Faculty of Computer Science & Information Technology, University of Malaya, 2018.
Uncontrolled Keywords: | Plant classification; Deep learning; Hybrid generic-organ CNN architecture; Morphological leaf
Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: | Faculty of Computer Science & Information Technology
Depositing User: | Mr Mohd Safri Tahir
Date Deposited: | 05 Sep 2018 08:38
Last Modified: | 01 Mar 2021 03:48
URI: | http://studentsrepo.um.edu.my/id/eprint/8758