Full metadata record
DC Field    Value    Language
dc.contributor.author    Ho, Yung-Han    en_US
dc.contributor.author    Cho, Chuan-Yuan    en_US
dc.contributor.author    Peng, Wen-Hsiao    en_US
dc.contributor.author    Jin, Guo-Lun    en_US
dc.date.accessioned    2020-10-05T02:01:30Z    -
dc.date.available    2020-10-05T02:01:30Z    -
dc.date.issued    2019-01-01    en_US
dc.identifier.isbn    978-1-7281-4803-8    en_US
dc.identifier.issn    1550-5499    en_US
dc.identifier.uri    http://dx.doi.org/10.1109/ICCV.2019.01056    en_US
dc.identifier.uri    http://hdl.handle.net/11536/155286    -
dc.description.abstract    This paper leverages a classic prediction technique, known as parametric overlapped block motion compensation (POBMC), in a reinforcement learning framework for video prediction. Learning-based prediction methods with explicit motion models often suffer from having to estimate large numbers of motion parameters with artificial regularization. Inspired by the success of sparse motion-based prediction for video compression, we propose a parametric video prediction on a sparse motion field composed of few critical pixels and their motion vectors. The prediction is achieved by gradually refining the estimate of a future frame in iterative, discrete steps. Along the way, the identification of critical pixels and their motion estimation are addressed by two neural networks trained under a reinforcement learning setting. Our model achieves the state-of-the-art performance on CaltechPed, UCF101 and CIF datasets in one-step and multi-step prediction tests. It shows good generalization results and is able to learn well on small training data.    en_US
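The abstract describes parametric prediction from a sparse motion field of a few critical pixels and their motion vectors. The following minimal NumPy sketch only illustrates that general idea under simplifying assumptions: it is not the authors' SME-Net or the POBMC kernel, the dense field is obtained by plain inverse-distance interpolation instead of overlapped-block kernels, and the function name predict_next_frame and all parameters are hypothetical.

import numpy as np

def predict_next_frame(prev_frame, critical_pixels, motion_vectors, eps=1e-6):
    # Warp prev_frame with a dense motion field interpolated from a sparse
    # set of (critical pixel, motion vector) pairs. Simplified illustration
    # only; not the POBMC formulation used in the paper.
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    grid = np.stack([ys, xs], axis=-1).astype(np.float64)        # (h, w, 2)

    pts = np.asarray(critical_pixels, dtype=np.float64)          # (k, 2) as (y, x)
    mvs = np.asarray(motion_vectors, dtype=np.float64)           # (k, 2) as (dy, dx)

    # Inverse-distance weights from every pixel to each critical pixel.
    diff = grid[:, :, None, :] - pts[None, None, :, :]           # (h, w, k, 2)
    dist2 = (diff ** 2).sum(axis=-1)                             # (h, w, k)
    weights = 1.0 / (dist2 + eps)
    weights /= weights.sum(axis=-1, keepdims=True)

    # Dense motion field as a weighted blend of the sparse motion vectors.
    dense_mv = (weights[..., None] * mvs[None, None, :, :]).sum(axis=2)

    # Backward warp with nearest-neighbour sampling, clipped to the frame.
    src = np.clip(np.rint(grid - dense_mv), [0, 0], [h - 1, w - 1]).astype(int)
    return prev_frame[src[..., 0], src[..., 1]]

# Toy usage: a single critical pixel carrying one global motion vector.
prev = np.random.rand(64, 64)
pred = predict_next_frame(prev, critical_pixels=[(32, 32)], motion_vectors=[(2.0, 3.0)])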
dc.language.iso    en_US    en_US
dc.title    SME-Net: Sparse Motion Estimation for Parametric Video Prediction through Reinforcement Learning    en_US
dc.type    Proceedings Paper    en_US
dc.identifier.doi    10.1109/ICCV.2019.01056    en_US
dc.identifier.journal    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019)    en_US
dc.citation.spage    10461    en_US
dc.citation.epage    10469    en_US
dc.contributor.department    資訊工程學系    zh_TW
dc.contributor.department    Department of Computer Science    en_US
dc.identifier.wosnumber    WOS:000548549205059    en_US
dc.citation.woscount    0    en_US
Appears in Collections: Conference Papers