Full metadata record
DC Field                     Value                                Language
dc.contributor.author        Chou, Chien-Li                       en_US
dc.contributor.author        Chen, Hua-Tsung                      en_US
dc.contributor.author        Hsu, Chun-Chieh                      en_US
dc.contributor.author        Ho, Chien-Peng                       en_US
dc.contributor.author        Lee, Suh-Yin                         en_US
dc.date.accessioned          2015-12-02T03:00:58Z                 -
dc.date.available            2015-12-02T03:00:58Z                 -
dc.date.issued               2014-01-01                           en_US
dc.identifier.isbn           978-1-4799-4761-4                    en_US
dc.identifier.issn           1945-7871                            en_US
dc.identifier.uri            http://hdl.handle.net/11536/128626   -
dc.description.abstract      With the explosive growth of social multimedia sharing, copyright protection and search-result refinement have become critical issues for service operators. To address these problems, content-based near-duplicate video retrieval has been developed in recent years. In this paper, we construct a condensed Pattern-based Prefix tree (PP-tree) to index the patterns of reference videos for fast retrieval. To estimate how likely a query video and a reference video are near-duplicates, a novel algorithm for discovering the temporal relations among patterns is proposed. Comprehensive experiments on public datasets are conducted to verify the effectiveness and efficiency of the proposed method. Experimental results show that the proposed near-duplicate video retrieval approach outperforms state-of-the-art approaches in terms of precision, recall, and execution time.   en_US
dc.language.iso              en_US                                en_US
dc.subject                   Near-duplicate video retrieval       en_US
dc.subject                   video copy detection                 en_US
dc.subject                   pattern matching                     en_US
dc.subject                   prefix tree                          en_US
dc.subject                   video retrieval                      en_US
dc.title                     NEAR-DUPLICATE VIDEO RETRIEVAL BY USING PATTERN-BASED PREFIX TREE AND TEMPORAL RELATION FOREST   en_US
dc.type                      Proceedings Paper                    en_US
dc.identifier.journal        2014 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME)   en_US
dc.contributor.department    資訊工程學系 (Department of Computer Science)   zh_TW
dc.contributor.department    Department of Computer Science       en_US
dc.identifier.wosnumber      WOS:000360831800174                  en_US
dc.citation.woscount         0                                    en_US
Appears in Collections: Conferences Paper