Full metadata record
DC Field | Value | Language
dc.contributor.author | Alghofaili, Rawan | en_US
dc.contributor.author | Sawahata, Yasuhito | en_US
dc.contributor.author | Huang, Haikun | en_US
dc.contributor.author | Wang, Hsueh-Cheng | en_US
dc.contributor.author | Shiratori, Takaaki | en_US
dc.contributor.author | Yu, Lap-Fai | en_US
dc.date.accessioned | 2019-08-02T02:14:46Z | -
dc.date.available | 2019-08-02T02:14:46Z | -
dc.date.issued | 2019-01-01 | en_US
dc.identifier.isbn | 978-1-4503-5970-2 | en_US
dc.identifier.uri | http://dx.doi.org/10.1145/3290605.3300578 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/152113 | -
dc.description.abstract | A key challenge for virtual reality level designers is striking a balance between maintaining the immersiveness of VR and providing users with on-screen aids after designing a virtual experience. These aids are often necessary for wayfinding in virtual environments with complex paths. We introduce a novel adaptive aid that maintains the effectiveness of traditional aids while giving designers and users control over how often help is displayed. Our adaptive aid uses gaze patterns to predict a user's need for navigation aid in VR and displays mini-maps or arrows accordingly. Using a dataset of gaze angle sequences from users navigating a VR environment, together with markers of when users requested aid, we trained an LSTM to classify a user's gaze sequences as indicating a need for navigation help, triggering the display of an aid. We validated the efficacy of the adaptive aid for wayfinding in comparison with other commonly used wayfinding aids. | en_US
dc.language.iso | en_US | en_US
dc.subject | Games/Play | en_US
dc.subject | Virtual/Augmented Reality | en_US
dc.subject | Eye Tracking | en_US
dc.title | Lost in Style: Gaze-driven Adaptive Aid for VR Navigation | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.doi | 10.1145/3290605.3300578 | en_US
dc.identifier.journal | CHI 2019: PROCEEDINGS OF THE 2019 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS | en_US
dc.citation.spage | 0 | en_US
dc.citation.epage | 0 | en_US
dc.contributor.department | Published under the name of NCTU | zh_TW
dc.contributor.department | National Chiao Tung University | en_US
dc.identifier.wosnumber | WOS:000474467904040 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conferences Paper
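
To illustrate the approach described in the abstract, the sketch below shows a minimal gaze-sequence classifier in PyTorch. It is a hypothetical illustration, not the authors' implementation: the class name, input features (yaw and pitch gaze angles), hidden size, window length, and decision threshold are all assumptions introduced here.

```python
# Hypothetical sketch (not the paper's code): a binary LSTM classifier over
# gaze-angle sequences, labeling a recent window as "needs navigation aid" or not.
import torch
import torch.nn as nn


class GazeAidClassifier(nn.Module):
    def __init__(self, input_dim=2, hidden_dim=64):
        super().__init__()
        # input_dim=2 assumes (yaw, pitch) gaze angles per time step
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x):
        # x: (batch, seq_len, input_dim) window of gaze angles
        _, (h_n, _) = self.lstm(x)
        # Probability that the user currently needs a navigation aid
        return torch.sigmoid(self.head(h_n[-1]))


# Usage: feed a sliding window of recent gaze angles; show a mini-map or
# arrow when the predicted probability exceeds a chosen threshold.
model = GazeAidClassifier()
window = torch.randn(1, 120, 2)  # e.g., 120 samples of (yaw, pitch); values are placeholders
if model(window).item() > 0.5:
    print("display navigation aid")
```

In this sketch, the model would be trained on windows of gaze-angle sequences labeled by whether the user requested aid shortly afterward, so that at run time the aid (mini-map or arrow) is shown only when the classifier predicts it is needed.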