Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chiu, Pei-Hsuan | en_US |
dc.contributor.author | Tseng, Po-Hsuan | en_US |
dc.contributor.author | Feng, Kai-Ten | en_US |
dc.date.accessioned | 2019-04-02T05:59:46Z | - |
dc.date.available | 2019-04-02T05:59:46Z | - |
dc.date.issued | 2018-10-01 | en_US |
dc.identifier.issn | 0018-9545 | en_US |
dc.identifier.uri | http://dx.doi.org/10.1109/TVT.2018.2864893 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/148342 | - |
dc.description.abstract | In recent years, augmented reality (AR) has been considered a promising technology that combines virtual information such as videos, images, and three-dimensional objects with a real camera view on mobile platforms. Interactive AR further provides human-computer interaction, allowing the user to interact with virtual objects on the mobile display. In this paper, we propose a cloud-based mobile augmented reality interactive system (MARIS), which includes MARIS-I for image target tracking and MARIS-H for hand motion tracking. MARIS-I estimates the position of the image target by adopting a feature-based mean-shift algorithm, whose small-region feature detection makes it feasible for real-time applications. MARIS-H provides two tracking modes, for fingertip and back-of-hand tracking, to enhance the user experience (UX) during interaction. The center position of either the back of the hand or the fingertip is first estimated by a particle filtering technique, which calculates the weighting of each particle according to a hand or fingertip model. Afterward, in the fingertip tracking mode, the contour of the fingertip is estimated by level-set-based contour evolution. Furthermore, we implement a device/cloud architecture for the proposed MARIS to decrease memory requirements and computational complexity on the device side. Experimental results show that MARIS, including MARIS-I and MARIS-H, outperforms other existing methods for image and hand motion tracking, respectively. The proposed MARIS is demonstrated on a picture book to provide rich interactive UX for digital learning systems. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Augmented reality | en_US |
dc.subject | motion tracking | en_US |
dc.subject | particle filtering | en_US |
dc.title | Interactive Mobile Augmented Reality System for Image and Hand Motion Tracking | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.1109/TVT.2018.2864893 | en_US |
dc.identifier.journal | IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY | en_US |
dc.citation.volume | 67 | en_US |
dc.citation.spage | 9995 | en_US |
dc.citation.epage | 10009 | en_US |
dc.contributor.department | 電機工程學系 | zh_TW |
dc.contributor.department | Department of Electrical and Computer Engineering | en_US |
dc.identifier.wosnumber | WOS:000447853300075 | en_US |
dc.citation.woscount | 0 | en_US |
Appears in Collections: | Articles |
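
The abstract describes estimating a hand or fingertip center position with particle filtering, where each particle is weighted against a hand/fingertip model. Below is a minimal, self-contained sketch of that general idea (sequential importance resampling on a 2-D position). The Gaussian observation model, random-walk motion model, and all parameter names are illustrative assumptions, not the paper's actual MARIS-H appearance-model weighting.

```python
import random
import math

def particle_filter_track(observations, n_particles=500,
                          motion_noise=2.0, obs_noise=5.0, seed=0):
    """Track a 2-D center position (e.g. back of hand) with a basic
    sequential-importance-resampling particle filter.

    NOTE: the Gaussian likelihood used here is an assumed stand-in for
    the hand/fingertip model the paper uses to weight particles.
    """
    rng = random.Random(seed)
    # Initialize particles around the first observation.
    x0, y0 = observations[0]
    particles = [(x0 + rng.gauss(0, obs_noise), y0 + rng.gauss(0, obs_noise))
                 for _ in range(n_particles)]
    estimates = []
    for (ox, oy) in observations:
        # Predict: propagate each particle with a random-walk motion model.
        particles = [(px + rng.gauss(0, motion_noise),
                      py + rng.gauss(0, motion_noise))
                     for (px, py) in particles]
        # Weight: likelihood of the observation under a Gaussian model.
        weights = [math.exp(-((px - ox) ** 2 + (py - oy) ** 2)
                            / (2 * obs_noise ** 2))
                   for (px, py) in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle positions.
        ex = sum(w * px for w, (px, _) in zip(weights, particles))
        ey = sum(w * py for w, (_, py) in zip(weights, particles))
        estimates.append((ex, ey))
        # Resample particles proportionally to their weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

In MARIS-H the per-particle weight would come from comparing image features at the particle's position against the hand or fingertip model, and the resulting center estimate then seeds the level-set contour evolution in fingertip mode.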