Full metadata record
DC Field | Value | Language
dc.contributor.author | Chou, Kuang-Pen | en_US
dc.contributor.author | Prasad, Mukesh | en_US
dc.contributor.author | Puthal, Deepak | en_US
dc.contributor.author | Chen, Ping-Hung | en_US
dc.contributor.author | Vishwakarma, Dinesh Kumar | en_US
dc.contributor.author | Sundaram, Suresh | en_US
dc.contributor.author | Lin, Chin-Teng | en_US
dc.contributor.author | Lin, Wen-Chieh | en_US
dc.date.accessioned | 2018-08-21T05:57:14Z | -
dc.date.available | 2018-08-21T05:57:14Z | -
dc.date.issued | 2017-01-01 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/147211 | -
dc.description.abstract | This paper proposes a novel Fast Deformable Model for Pedestrian Detection (FDMPD) to detect pedestrians efficiently and accurately in crowded environments. Although multiple detection methods are available, detection remains difficult due to the variety of human postures and perspectives. The proposed study is divided into two parts. The first part trains six Adaboost classifiers with Haar-like features for different body parts (e.g., head, shoulders, and knees) to build response feature maps. The second part uses these six response feature maps with a full-body model to produce spatial deep features. The combined deep features are used as input to an SVM to judge the existence of a pedestrian. In experiments conducted on the INRIA person dataset, the proposed FDMPD approach shows an improvement of more than 44.75% over other state-of-the-art methods in terms of efficiency and robustness. | en_US
dc.language.iso | en_US | en_US
dc.subject | Pedestrian | en_US
dc.subject | Adaboost | en_US
dc.subject | Multi-view | en_US
dc.subject | Deformable part model | en_US
dc.title | Fast Deformable Model for Pedestrian Detection with Haar-like Features | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2017 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI) | en_US
dc.citation.spage | 259 | en_US
dc.citation.epage | 266 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | 電控工程研究所 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.contributor.department | Institute of Electrical and Control Engineering | en_US
dc.identifier.wosnumber | WOS:000428251400037 | en_US
Appears in Collections: Conference Papers
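
The abstract above outlines a two-stage pipeline: Haar-like/Adaboost part detectors produce per-part response maps, which are combined into spatial features and classified by an SVM. Purely as an illustration of that flow, here is a minimal Python sketch; the cascade file names, the 8x4 response grid, and the use of OpenCV's CascadeClassifier together with scikit-learn's SVC are assumptions made for the sketch, not the authors' actual models or code.

```python
# Hypothetical sketch of the two-stage FDMPD pipeline described in the abstract.
# Part-cascade file names and the response-grid size are placeholders, not the paper's models.
import numpy as np
import cv2
from sklearn.svm import SVC

PART_CASCADES = [
    "cascade_head.xml",            # hypothetical Haar+Adaboost cascade per body part
    "cascade_left_shoulder.xml",
    "cascade_right_shoulder.xml",
    "cascade_torso.xml",
    "cascade_left_knee.xml",
    "cascade_right_knee.xml",
]

def part_response_maps(window, cascades, grid=(8, 4)):
    """Stage 1: run each part detector over a candidate window and
    accumulate its detections into a coarse spatial response map."""
    h, w = window.shape[:2]
    maps = []
    for cascade in cascades:
        resp = np.zeros(grid, dtype=np.float32)
        for (x, y, bw, bh) in cascade.detectMultiScale(window, scaleFactor=1.1, minNeighbors=3):
            # Mark the grid cell containing the detection centre.
            gy = min(int((y + bh / 2) / h * grid[0]), grid[0] - 1)
            gx = min(int((x + bw / 2) / w * grid[1]), grid[1] - 1)
            resp[gy, gx] += 1.0
        maps.append(resp)
    return maps

def window_features(window, cascades):
    """Stage 2: flatten and concatenate the six response maps into one
    spatial feature vector for the full-body classifier."""
    return np.concatenate([m.ravel() for m in part_response_maps(window, cascades)])

if __name__ == "__main__":
    cascades = [cv2.CascadeClassifier(path) for path in PART_CASCADES]
    # Training windows (pedestrian / background grayscale crops) would come from e.g. INRIA.
    train_windows, labels = [], []          # fill with image crops and 0/1 labels
    X = np.array([window_features(w, cascades) for w in train_windows])
    clf = SVC(kernel="linear")
    if len(X):
        clf.fit(X, labels)                  # final SVM decides pedestrian vs. not
```

At test time the same feature extraction would be applied to each candidate window and `clf.predict` would give the pedestrian/non-pedestrian decision; how the paper fuses the part responses with its full-body deformable model is not specified here, so the simple grid accumulation above is only a stand-in.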