Full metadata record
DC Field | Value | Language
dc.contributor.author | Chen, Yen-Lin | en_US
dc.contributor.author | Liang, Wen-Yew | en_US
dc.contributor.author | Chiang, Chuan-Yen | en_US
dc.contributor.author | Hsieh, Tung-Ju | en_US
dc.contributor.author | Lee, Da-Cheng | en_US
dc.contributor.author | Yuan, Shyan-Ming | en_US
dc.contributor.author | Chang, Yang-Lang | en_US
dc.date.accessioned | 2014-12-08T15:30:32Z | -
dc.date.available | 2014-12-08T15:30:32Z | -
dc.date.issued | 2011-07-01 | en_US
dc.identifier.issn | 1424-8220 | en_US
dc.identifier.uri | http://dx.doi.org/10.3390/s110706868 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/21816 | -
dc.description.abstract | This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noises. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the blob tracking phase establishes the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. Second, the touch event recognition phase identifies meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. | en_US
dc.language.iso | en_US | en_US
dc.subject | multi-touch sensing | en_US
dc.subject | computer vision | en_US
dc.subject | finger detection | en_US
dc.subject | finger tracking | en_US
dc.subject | multi-touch event identification | en_US
dc.title | Vision-Based Finger Detection, Tracking, and Event Identification Techniques for Multi-Touch Sensing and Display Systems | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.3390/s110706868 | en_US
dc.identifier.journal | SENSORS | en_US
dc.citation.volume | 11 | en_US
dc.citation.issue | 7 | en_US
dc.citation.spage | 6868 | en_US
dc.citation.epage | 6892 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000293069200023 | -
dc.citation.woscount | 5 | -
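The abstract above describes a pipeline of bright-blob segmentation by automatic multilevel histogram thresholding, connected-component analysis, and frame-to-frame blob tracking. The following is a minimal sketch of that kind of pipeline, not the authors' implementation: it substitutes off-the-shelf multilevel Otsu thresholding (scikit-image) and OpenCV connected components, and the function names, the min_area and max_dist parameters, and the greedy nearest-neighbour matcher are illustrative assumptions.

import numpy as np
import cv2
from skimage.filters import threshold_multiotsu


def extract_touch_blobs(gray_frame, min_area=20):
    """Return (x, y) centroids of bright touch blobs in a grayscale frame."""
    # Split the intensity histogram into three classes and keep only the
    # brightest one, which stands in for the scattered-IR touch-blob pixels.
    thresholds = threshold_multiotsu(gray_frame, classes=3)
    bright = (gray_frame > thresholds[-1]).astype(np.uint8)

    # Connected-component analysis groups the bright pixels into blobs.
    n_labels, _, stats, centroids = cv2.connectedComponentsWithStats(bright)
    blobs = []
    for label in range(1, n_labels):  # label 0 is the background
        if stats[label, cv2.CC_STAT_AREA] >= min_area:
            blobs.append(tuple(centroids[label]))
    return blobs


def associate_blobs(prev_blobs, curr_blobs, max_dist=30.0):
    """Greedy nearest-neighbour matching of blobs between consecutive frames
    (a generic stand-in for the paper's motion-correspondence analysis)."""
    matches, used = [], set()
    for i, p in enumerate(prev_blobs):
        best_j, best_d = None, max_dist
        for j, c in enumerate(curr_blobs):
            if j in used:
                continue
            d = float(np.hypot(p[0] - c[0], p[1] - c[1]))
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches.append((i, best_j))
            used.add(best_j)
    return matches  # unmatched current blobs would be treated as new touches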
Appears in Collections: Journal Articles


Files in This Item:

  1. 000293069200023.pdf

If the file is a ZIP archive, please download and unzip it, then open the index.html in the extracted folder with a browser to view the full text.