Full metadata record
DC Field | Value | Language
dc.contributor.author | 温培志 | en_US
dc.contributor.author | Weng, Pei-Chih | en_US
dc.contributor.author | 王昱舜 | en_US
dc.contributor.author | Wang, Yu-Shuen | en_US
dc.date.accessioned | 2014-12-12T02:36:54Z | -
dc.date.available | 2014-12-12T02:36:54Z | -
dc.date.issued | 2013 | en_US
dc.identifier.uri | http://140.113.39.130/cdrfb3/record/nctu/#GT070056101 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/73053 | -
dc.description.abstract | We present a stroke-based system that allows users to retrieve basketball video clips easily and intuitively. In contrast to current retrieval systems, which rely mainly on keywords, users can draw player trajectories on our defined basketball court coordinate system and specify the corresponding events, such as shot made or shot missed, to give a more specific search condition and avoid unwanted results during retrieval. Because players are perspectively projected onto each video frame and the cameras in broadcast videos are dynamic, the specified strokes and the extracted player trajectories are not directly comparable; we therefore map player positions in each frame to the defined basketball court coordinate system using camera calibration. To achieve a robust mapping, our system considers the whole video clip and reconstructs a panoramic basketball court, and then rectifies the panoramic court to our defined court using a homography. Since this reconstruction can map pixels from a video frame to our defined court coordinate system, it can also map player trajectories between the two coordinate systems. To obtain the event of a video clip, we extract the game time using optical character recognition and map it to the event logs in a play-by-play text that is available online. Thanks to these two types of semantic information, our system is very helpful to coaches and to a tremendous number of spectators. The retrieved videos with the corresponding search conditions shown in \ref{fig:cutin1}-\ref{fig:differentstroke}, together with our accompanying video, verify the feasibility of our technique. | en_US
dc.language.iso | en_US | en_US
dc.subject | 視頻檢索 | zh_TW
dc.subject | 籃球影片處理 | zh_TW
dc.subject | 基於筆畫 | zh_TW
dc.subject | video retrieval | en_US
dc.subject | basketball video processing | en_US
dc.subject | stroke-based | en_US
dc.title | 基於筆畫之轉播籃球影片視頻檢索 | zh_TW
dc.title | Stroke-based Broadcast Basketball Video Retrieval | en_US
dc.type | Thesis | en_US
dc.contributor.department | 資訊科學與工程研究所 | zh_TW
Appears in Collections: Thesis
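For context, below is a minimal sketch of the homography-based frame-to-court mapping that the abstract describes, assuming OpenCV and a few hand-picked point correspondences between a broadcast frame and the defined court coordinate system. The point values, court dimensions, and the helper name frame_to_court are illustrative placeholders, not taken from the thesis.

```python
# Illustrative sketch only: maps player positions from a video frame onto a
# canonical court coordinate system with a homography, as the abstract describes.
# The correspondences and dimensions below are placeholder values.
import numpy as np
import cv2

# Four (or more) correspondences between frame pixels and court coordinates,
# e.g. the corners of the free-throw lane identified in the broadcast frame.
frame_pts = np.array([[412, 318], [870, 325], [905, 530], [380, 520]], dtype=np.float32)
court_pts = np.array([[0, 0], [488, 0], [488, 579], [0, 579]], dtype=np.float32)  # cm, illustrative

# Estimate the frame-to-court homography (RANSAC discards outlier matches).
H, _ = cv2.findHomography(frame_pts, court_pts, method=cv2.RANSAC)

def frame_to_court(points_xy):
    """Project (N, 2) pixel positions onto the court plane."""
    pts = np.asarray(points_xy, dtype=np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

# A tracked player trajectory (pixel positions over consecutive frames) becomes
# a stroke in court coordinates that can be compared with a user's query stroke.
trajectory_px = [[500, 400], [520, 410], [545, 415]]
print(frame_to_court(trajectory_px))
```

In the system described by the abstract, the rectification is computed against a reconstructed panoramic court rather than individual frames; the sketch only illustrates the final homography step that maps positions between the two coordinate systems.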