Full metadata record
DC Field | Value | Language
dc.contributor.author | Lo, Kuo-Hua | en_US
dc.contributor.author | Wang, Chih-Jung | en_US
dc.contributor.author | Chuang, Jen-Hui | en_US
dc.contributor.author | Chen, Hua-Tsung | en_US
dc.date.accessioned | 2014-12-08T15:36:54Z | -
dc.date.available | 2014-12-08T15:36:54Z | -
dc.date.issued | 2012-01-01 | en_US
dc.identifier.isbn | 978-4-9906441-0-9; 978-1-4673-2216-4 | en_US
dc.identifier.issn | 1051-4651 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/25294 | -
dc.description.abstract | With the growing popularity of vision-based camera surveillance, research on people localization has attracted much attention. In this paper, we propose an efficient and effective system capable of locating a dense crowd of people in real time using multiple cameras. For each camera view, sample lines of foreground objects, originating from a vanishing point, are projected onto the ground plane. Ground regions containing a high density of projected lines are then used to find people locations. Improving on previous works, the people localization approach proposed in this paper does not need to project all foreground pixels of all views onto multiple reference planes, nor to compute pairwise intersections of projected sample lines at different heights, resulting in a significant improvement in computational efficiency. Furthermore, people's heights can also be estimated. Experimental results on real surveillance scenes show that comparable people localization accuracy can be achieved at five times the computing speed of our previous approach. | en_US
dc.language.iso | en_US | en_US
dc.title | Acceleration of Vanishing Point-Based Line Sampling Scheme for People Localization and Height Estimation via 3D Line Sampling | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012) | en_US
dc.citation.volume | | en_US
dc.citation.issue | | en_US
dc.citation.spage | 2788 | en_US
dc.citation.epage | 2791 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000343660602214 | -
Appears in Collections: Conferences Paper
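The abstract's core idea of voting for ground-plane locations with vanishing-point sample lines can be illustrated with a minimal sketch. The sketch below assumes a binary foreground mask, a known vertical vanishing point, and an image-to-ground homography; the function name, parameters, and the reduction of each sample line to a single foot-point estimate are assumptions made for this example, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: the function name, array shapes, and the
# homography H_ground are assumptions for this example, not the paper's code.

def ground_density(fg_mask, vp, H_ground, grid_shape, n_samples=200, seed=0):
    """Accumulate a ground-plane density map from vanishing-point sample
    lines of a foreground mask; density peaks suggest people locations."""
    h, w = fg_mask.shape
    density = np.zeros(grid_shape, dtype=np.float32)
    ys, xs = np.nonzero(fg_mask)
    if xs.size == 0:
        return density
    rng = np.random.default_rng(seed)
    idx = rng.choice(xs.size, size=min(n_samples, xs.size), replace=False)
    for x, y in zip(xs[idx], ys[idx]):
        # Each foreground pixel defines a sample line through the vertical
        # vanishing point vp (people stand roughly vertically in the image).
        # Here the line is crudely reduced to its intersection with the
        # bottom image row, used as a foot-point estimate on the ground.
        t = (h - 1 - vp[1]) / (y - vp[1] + 1e-9)
        foot = np.array([vp[0] + t * (x - vp[0]), h - 1.0, 1.0])
        g = H_ground @ foot                      # image -> ground homography
        ix, iy = int(np.floor(g[0] / g[2])), int(np.floor(g[1] / g[2]))
        if 0 <= iy < grid_shape[0] and 0 <= ix < grid_shape[1]:
            density[iy, ix] += 1.0               # vote for this ground cell
    return density
```

In a multi-camera setup along the lines the abstract describes, one such map would be built per view and the maps summed before detecting density peaks; the per-pixel foot-point voting here is a simplification of projecting whole sample lines.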