Nweke, Henry Friday (2019) Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke. PhD thesis, University of Malaya.
PDF (The Candidate's Agreement) – Restricted to Repository staff only
PDF (Thesis PhD) – Restricted to Repository staff only until 31 December 2021
Abstract
Human activity detection through the fusion of multimodal sensors is a vital step towards automatic and comprehensive monitoring of human behaviour, building smart home systems and detecting sports activities. In addition, human activity detection methods have wide applications in security, surveillance and postural detection to prevent falls in the elderly. The proliferation of sensor-embedded devices such as wearable sensors, ambient environments and smartphones has significantly facilitated automatic and ubiquitous collection of sensor data for the analysis of human activity details. Over the years, various machine learning methods have been proposed to analyse the collected sensor data and infer activity details. However, analysis of mobile and wearable sensor data for human activity detection remains very challenging, and is worsened further by reliance on a single sensor modality and a single machine learning algorithm. Furthermore, robust and efficient methods are required to handle issues such as orientation and position displacement, sensor fusion and feature incompatibility, automatic feature representation, and the minimisation of intra-class similarity and inter-class variability. Hence, to solve the above issues, the following objectives were formulated. First, to investigate existing multi-sensor and automatic feature extraction methods for human activity detection and health monitoring using motion sensors. Second, to propose multi-sensor fusion based on a multiple classifier system to reduce activity misrecognition and feature incompatibility. Third, to propose orientation-invariant deep sparse autoencoder methods for automatic complex activity identification, minimising orientation inconsistencies and learning adequate data patterns. Fourth, to validate the performance of the proposed multi-sensor fusion methods on challenging motion sensor data generated by smartphones and wearable sensors.
Finally, to compare their performance with existing multi-sensor fusion and feature extraction methods for human activity detection and health monitoring. Experimental results demonstrate the capability of the proposed multi-sensor fusion through multiple classifier systems and the orientation-invariant deep learning methods for human activity detection and health monitoring. For the first method, which utilises multi-sensor fusion and multiple classifier systems, the proposed approach improves accuracy by 3% to 27% over single sensor modalities and feature-level fusion. For the second method, which utilises deep learning and orientation-invariant features, the proposed automatic feature representation outperforms existing methods, obtaining 97.3% accuracy, 97.13% recall and 99.76% sensitivity. In addition, the proposed automatic feature representation method achieved detection accuracy improvements of between 2% and 51% over existing methods. The proposed methods can be implemented for accurate monitoring and early detection of activity details using a wide range of sensors in mobile and wearable devices.
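The abstract names two technical ingredients: fusion of per-sensor classifier decisions through a multiple classifier system, and orientation-invariant features for motion sensor data. The thesis text is not reproduced here, so the following is only an illustrative sketch of both ideas in Python, with entirely hypothetical function names, labels and data — majority voting as one simple decision-level fusion rule, and the accelerometer magnitude as one simple rotation-invariant feature:

```python
import numpy as np
from collections import Counter

def majority_vote(per_sensor_preds):
    """Decision-level fusion: combine per-sensor classifier outputs by
    majority vote over each sample (ties broken by first occurrence)."""
    stacked = np.vstack(per_sensor_preds)  # shape (n_sensors, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in stacked.T])

def magnitude_feature(acc_xyz):
    """Per-sample Euclidean norm of a 3-axis accelerometer window; the
    norm is unchanged by any rotation of the device frame, making it a
    simple orientation-invariant feature."""
    return np.linalg.norm(acc_xyz, axis=1)

# Hypothetical label predictions from accelerometer, gyroscope and
# magnetometer classifiers for four samples (0=sit, 1=walk, 2=run).
acc_pred = np.array([0, 1, 2, 1])
gyr_pred = np.array([0, 1, 1, 1])
mag_pred = np.array([0, 2, 2, 0])
fused = majority_vote([acc_pred, gyr_pred, mag_pred])
print(fused)  # [0 1 2 1]

# Orientation invariance: rotating the sensor frame (here 90 degrees
# about the z-axis, i.e. a device re-orientation) leaves the magnitude
# feature unchanged.
window = np.random.default_rng(0).normal(size=(128, 3))
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
assert np.allclose(magnitude_feature(window), magnitude_feature(window @ R.T))
```

Note that majority voting is only the simplest member of the multiple-classifier-system family; the thesis's actual fusion rule and the deep sparse autoencoder feature learner are described in the full text, not sketched here.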
| Item Type: | Thesis (PhD) |
|---|---|
| Additional Information: | Thesis (PhD) – Faculty of Computer Science & Information Technology, University of Malaya, 2019. |
| Uncontrolled Keywords: | Multi-sensor fusion; Motion sensor data; Smartphones; Deep sparse autoencoder methods; Human activity detection |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Faculty of Computer Science & Information Technology |
| Depositing User: | Mr Mohd Safri Tahir |
| Date Deposited: | 19 May 2020 01:24 |
| Last Modified: | 19 May 2020 01:24 |
| URI: | http://studentsrepo.um.edu.my/id/eprint/11162 |