Title: | Enabling Identity-Aware Tracking via Fusion of Visual and Inertial Features |
Authors: | Tsai, Richard Yi-Chia; Ke, Hans Ting-Yuan; Lin, Kate Ching-Ju; Tseng, Yu-Chee; Department of Computer Science |
Keywords: | Computer Vision;Data Fusion;IoT;Person Identification;Tracking;Robotics;Wearable Computing |
Issue Date: | 1-Jan-2019 |
Abstract: | Person identification and tracking (PIT) is an essential problem in computer vision and robotic applications. It has long been studied and addressed with technologies such as RFID or face/fingerprint/iris recognition. These approaches, however, have limitations stemming from environmental constraints (such as lighting and obstacles) or the need for close contact with specific devices; their recognition accuracy therefore depends heavily on the usage scenario. In this work, we propose RCU (Robot Catch yoU), an accompanying robot system that provides follow-me or guide-me services. Such a robot can distinguish the profiles of users in front of it and keep tracking a specific target person. We study a more challenging scenario in which the target person may be occluded from time to time. To enable robust PIT, we develop a data fusion technique that integrates two types of sensors: an RGB-D camera and wearable inertial sensors. Since the data generated by these sensors share common features, we can fuse them to achieve identity-aware tracking. Practical issues, such as time synchronization and coordinate calibration, are also addressed. We implement our design on a robotic platform and show that it can track a target person even when no biological feature is captured by the RGB-D camera. Our experimental evaluation shows a recognition rate of 95% and a following rate of 88%. |
URI: | http://hdl.handle.net/11536/153331 |
ISBN: | 978-1-5386-6026-3 |
ISSN: | 1050-4729 |
Journal: | 2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA) |
Start Page: | 2260 |
End Page: | 2266 |
Appears in Collections: | Conference Paper |