Full metadata record
DC Field | Value | Language
dc.contributor.author | Lai, Yu-Chun | en_US
dc.contributor.author | Liao, Hong-Yuan Mark | en_US
dc.contributor.author | Lin, Cheng-Chung | en_US
dc.date.accessioned | 2014-12-08T15:10:32Z | -
dc.date.available | 2014-12-08T15:10:32Z | -
dc.date.issued | 2007 | en_US
dc.identifier.isbn | 978-3-540-77254-5 | en_US
dc.identifier.issn | 0302-9743 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/8046 | -
dc.description.abstract | We propose an intrinsic-distance-based segmentation approach for segmenting human body parts in video frames. First, since the human body can be seen as a set of articulated parts, we utilize the moving articulated attributes to identify body-part candidate regions automatically. The candidate regions and the background candidate regions are generated by voting and assigned to the spatiotemporal volume, which is composed of the frames of the video. Then, the intrinsic distance is used to estimate the boundaries of each body part. Our intrinsic-distance-based segmentation technique is applied to the spatiotemporal volume to extract the optimal intrinsic-distance boundaries in a video and obtain segmented frames from the segmented volume. The results show that the proposed approach can tolerate incomplete and imprecise candidate regions because it provides temporal continuity. Furthermore, it can reduce over-growing in the original intrinsic-distance-based algorithm, since it can handle ambiguous pixels. We expect that this research can provide an alternative for segmenting a sequence of body parts in a video. | en_US
dc.language.iso | en_US | en_US
dc.subject | segmentation | en_US
dc.subject | human body part | en_US
dc.subject | intrinsic distance | en_US
dc.title | Segmentation of human body parts in video frames based on intrinsic distance | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2007 | en_US
dc.citation.volume | 4810 | en_US
dc.citation.spage | 450 | en_US
dc.citation.epage | 453 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000253081100057 | -
Appears in Collections: Conference Papers
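The abstract above describes segmenting a spatio-temporal volume by intrinsic (geodesic-style) distance from voted candidate regions. The sketch below is a minimal, hypothetical illustration of that general idea, not the authors' implementation: a multi-source Dijkstra pass assigns each voxel the label of the seed region with the smallest intensity-weighted path cost. The function name, the 6-connected neighbourhood, and the weighting constant `beta` are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): approximate an intrinsic
# (geodesic) distance over a spatio-temporal volume with multi-source
# Dijkstra. Step costs grow with intensity differences, so shortest paths
# avoid crossing strong boundaries, and each voxel inherits the label of
# its intrinsically closest seed region.
import heapq
import numpy as np

def intrinsic_distance_labels(volume, seeds, beta=10.0):
    """volume: float array (T, H, W), intensities in [0, 1].
    seeds:  int array of the same shape; 0 = unlabeled, >0 = candidate-region label.
    Returns an int array of per-voxel labels."""
    dist = np.full(volume.shape, np.inf)
    labels = np.zeros(volume.shape, dtype=np.int32)
    heap = []
    # Every seeded voxel starts at distance 0 with its own label.
    for idx in zip(*np.nonzero(seeds)):
        dist[idx] = 0.0
        labels[idx] = seeds[idx]
        heapq.heappush(heap, (0.0, idx))
    # 6-connected neighbourhood: two temporal and four spatial neighbours.
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while heap:
        d, (t, y, x) = heapq.heappop(heap)
        if d > dist[t, y, x]:
            continue  # stale heap entry
        for dt, dy, dx in offsets:
            nt, ny, nx = t + dt, y + dy, x + dx
            if not (0 <= nt < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2]):
                continue
            # Step cost: unit length plus a penalty for intensity change,
            # so the accumulated cost behaves like a geodesic distance.
            nd = d + 1.0 + beta * abs(volume[t, y, x] - volume[nt, ny, nx])
            if nd < dist[nt, ny, nx]:
                dist[nt, ny, nx] = nd
                labels[nt, ny, nx] = labels[t, y, x]
                heapq.heappush(heap, (nd, (nt, ny, nx)))
    return labels
```

In the paper's setting the seeds would come from the voted body-part and background candidate regions; for experimentation, any labeled mask of the same shape as the volume works.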