Full metadata record
DC Field | Value | Language
dc.contributor.author | Chang, Wei-Chun | en_US
dc.contributor.author | Wu, Cheng-Wei | en_US
dc.contributor.author | Tsai, Richard Yi-Chia | en_US
dc.contributor.author | Lin, Kate Ching-Ju | en_US
dc.contributor.author | Tseng, Yu-Chee | en_US
dc.date.accessioned | 2019-04-02T06:04:13Z | -
dc.date.available | 2019-04-02T06:04:13Z | -
dc.date.issued | 2018-01-01 | en_US
dc.identifier.issn | 1050-4729 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/150767 | -
dc.description.abstract | Person identification (PID) is a key issue in many IoT applications. It has long been studied and achieved with technologies such as RFID and face/fingerprint/iris recognition. These approaches, however, are limited by environmental constraints (such as lighting and obstacles) or require close contact with specific devices, so their recognition rates depend heavily on the use scenario. To enable reliable and remote PID, in this work we present EOY (Eye On You), a data fusion approach that combines two kinds of sensors: a 3D depth camera and wearable sensors embedded with inertial measurement units (IMUs). Since these two kinds of data share common features, we are able to fuse them to conduct PID. Further, the result can be transferred to a mobile platform (such as a robot), since we place fewer constraints on the devices. To realize EOY, we develop fusion algorithms that address practical challenges such as asynchronous timing and coordinate calibration. The experimental evaluation shows that EOY achieves a recognition rate of 95% and remains robust even in crowded areas. | en_US
dc.language.iso | en_US | en_US
dc.title | Eye On You: Fusing Gesture Data from Depth Camera and Inertial Sensors for Person Identification | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA) | en_US
dc.citation.spage | 2021 | en_US
dc.citation.epage | 2026 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000446394501085 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conferences Paper