Full metadata record
DC Field | Value | Language
dc.contributor.author | Wu, BF | en_US
dc.contributor.author | Lin, SP | en_US
dc.contributor.author | Chen, YH | en_US
dc.date.accessioned | 2014-12-08T15:25:37Z | -
dc.date.available | 2014-12-08T15:25:37Z | -
dc.date.issued | 2005 | en_US
dc.identifier.isbn | 0-7803-9313-9 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/18015 | -
dc.description.abstract | The proposed multiple-vehicle detection and tracking (MVDT) system uses a color background to segment moving objects and exploits relations between the moving objects and existing trajectories to track vehicles. Initially, the background is extracted by classification. It is then regularly updated with previous moving objects to guarantee robust segmentation under changing luminance. A background that has partially converged incorrectly because of roadside parked vehicles is corrected later by checking the fed-back trajectories, avoiding false detections after those vehicles move away. In the tracking process, relations of distances, or of distances and angles, are applied to determine whether to create, extend, or delete a trajectory. If occlusion is detected after trajectory creation, it is resolved by rule-based tracking reasoning; otherwise, lane information is used. Finally, traffic parameter calculations based on the trajectories are listed. Moreover, for easy setup, parameter automation for the system is proposed. | en_US
dc.language.iso | en_US | en_US
dc.subject | detection | en_US
dc.subject | segmentation | en_US
dc.subject | tracking | en_US
dc.subject | occlusion | en_US
dc.subject | rule-based reasoning | en_US
dc.subject | traffic parameter | en_US
dc.title | A real-time multiple-vehicle detection and tracking system with prior occlusion detection and resolution | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2005 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Vols 1 and 2 | en_US
dc.citation.spage | 311 | en_US
dc.citation.epage | 316 | en_US
dc.contributor.department | 電控工程研究所 | zh_TW
dc.contributor.department | Institute of Electrical and Control Engineering | en_US
dc.identifier.wosnumber | WOS:000236568000055 | -
Appears in Collections: Conferences Paper
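
The abstract above describes segmenting moving objects against a color background model that is refreshed only where no vehicle was previously detected. A minimal illustrative sketch of that general idea, assuming a simple running-average update and Euclidean color-distance thresholding (the paper's actual classification-based background extraction, trajectory feedback, and update rules are not reproduced here):

```python
import numpy as np

def segment_moving_objects(background, frame, threshold=30.0):
    """Mark pixels whose color differs from the background model by more
    than a threshold as moving (plain background subtraction; threshold
    value is an assumption for illustration)."""
    diff = np.linalg.norm(frame.astype(np.float32)
                          - background.astype(np.float32), axis=2)
    return diff > threshold

def update_background(background, frame, moving_mask, alpha=0.05):
    """Blend the current frame into the background, but only at pixels not
    classified as moving, so vehicles are not absorbed into the model
    (running-average update used here as a stand-in for the paper's rule)."""
    updated = background.astype(np.float32).copy()
    stationary = ~moving_mask
    updated[stationary] = ((1.0 - alpha) * updated[stationary]
                           + alpha * frame[stationary].astype(np.float32))
    return updated
```

In a per-frame loop, `segment_moving_objects` would produce the mask that feeds tracking, and the same mask would gate `update_background`, mirroring the abstract's point that previously detected moving objects are excluded when the background is refreshed.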