Title: | Automatic segmentation and summarization for videos taken with smart glasses |
Authors: | Chiu, Yen-Chia; Liu, Li-Yi; Wang, Tsaipei; Department of Computer Science |
Keywords: | Google Glass;Smart glasses;Egocentric video;Video abstraction;Video segmentation;Video summarization;Video diary |
Issue Date: | 1-May-2018 |
Abstract: | This paper discusses the automatic segmentation and extraction of important segments of videos taken with Google Glass. Using information from both the video images and additional sensor data recorded concurrently, we devise methods that automatically divide a video into coherent segments and estimate the importance of each segment. This information then enables automatic generation of a video summary that contains only the important segments. The features used include colors, image details, motion, and speech. We then train multi-layer perceptrons for the two tasks (segmentation and importance estimation) according to human annotations. We also present a systematic evaluation procedure that compares the automatic segmentation and importance-estimation results with those given by multiple users, and demonstrate the effectiveness of our approach. |
URI: | http://dx.doi.org/10.1007/s11042-017-4910-8 http://hdl.handle.net/11536/145037 |
ISSN: | 1380-7501 |
DOI: | 10.1007/s11042-017-4910-8 |
Journal: | MULTIMEDIA TOOLS AND APPLICATIONS |
Volume: | 77 |
Start Page: | 12679 |
End Page: | 12699 |
Appears in Collections: | Journal Articles |
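
The abstract describes training multi-layer perceptrons on per-segment features (colors, image details, motion, speech) to predict segment importance from human annotations, and then keeping only the important segments in the summary. The sketch below illustrates that general idea only; it is not the authors' implementation, and the feature layout, dimensions, hyperparameters, and use of scikit-learn's MLPRegressor are all assumptions made for illustration.

```python
# Minimal sketch (not the authors' code): an MLP maps per-segment features
# -- color, image-detail, motion, and speech descriptors, as named in the
# abstract -- to a human-annotated importance score. All feature names,
# dimensions, and hyperparameters here are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder training data: one row per video segment.
# Hypothetical columns: color-histogram statistics, edge density,
# average motion magnitude, fraction of frames containing speech.
X_train = rng.random((200, 10))      # 200 segments, 10-dim feature vectors
y_train = rng.random(200)            # annotated importance scores in [0, 1]

# Multi-layer perceptron for importance estimation (one of the two tasks
# in the abstract; segmentation would be handled by a separate MLP).
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Rank a new video's segments by predicted importance and keep the
# top-k segments as the automatically generated summary.
X_new = rng.random((20, 10))         # features of 20 segments from a new video
scores = model.predict(X_new)
top_k = np.argsort(scores)[::-1][:5]
print("Segments selected for the summary:", sorted(top_k.tolist()))
```

Ranking by predicted importance and truncating to a length budget is one simple way to turn per-segment scores into a summary; the paper's actual selection and evaluation procedure may differ.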