Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, Jiun-Fu | en_US |
dc.contributor.author | Wang, Chieh-Chih | en_US |
dc.contributor.author | Wu, Eric Hsiao-Kuang | en_US |
dc.contributor.author | Chou, Cheng-Fu | en_US |
dc.date.accessioned | 2020-10-05T02:01:55Z | - |
dc.date.available | 2020-10-05T02:01:55Z | - |
dc.date.issued | 2020-09-01 | en_US |
dc.identifier.issn | 1932-8184 | en_US |
dc.identifier.uri | http://dx.doi.org/10.1109/JSYST.2020.2963842 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/155337 | - |
dc.description.abstract | In stroke rehabilitation systems and applications, reliability, accuracy, and occlusion should all be taken into consideration. Unfortunately, most existing approaches focus primarily on the first two issues, yet during the stroke rehabilitation process occlusion leads to incorrect judgements even for medical staff. To tackle these three important issues simultaneously, we propose a heterogeneous sensor fusion framework composed of an RGB-D camera and a wearable device that handles occlusion and provides robust joint locations for rehabilitation. To fuse multiple sensor measurements while compensating for occlusion, we apply heterogeneous sensor simultaneous localization, tracking, and modeling to estimate the locations of joints and sensors and to construct an upper extremity model for occlusion situations. Virtual measurements based on this model are used to estimate joint locations during occlusion, and a virtual relative orientation technique is applied to relax the system's limitations regarding orientation. Experimental results using the proposed approach with synthetic data and data collected from ten subjects show a 4.6 cm error on average overall and about a 15 cm error on average during occlusion. This constitutes a more robust approach for stroke patients that takes these three important issues into account. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Stroke (medical condition) | en_US |
dc.subject | Extremities | en_US |
dc.subject | Cameras | en_US |
dc.subject | Tracking | en_US |
dc.subject | Kinematics | en_US |
dc.subject | Sensor fusion | en_US |
dc.subject | Motion tracking | en_US |
dc.subject | RGB-D camera | en_US |
dc.subject | sensor fusion | en_US |
dc.subject | stroke rehabilitation | en_US |
dc.subject | upper extremity model | en_US |
dc.subject | wearable device | en_US |
dc.title | Simultaneous Heterogeneous Sensor Localization, Joint Tracking, and Upper Extremity Modeling for Stroke Rehabilitation | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.1109/JSYST.2020.2963842 | en_US |
dc.identifier.journal | IEEE SYSTEMS JOURNAL | en_US |
dc.citation.volume | 14 | en_US |
dc.citation.issue | 3 | en_US |
dc.citation.spage | 3570 | en_US |
dc.citation.epage | 3581 | en_US |
dc.contributor.department | 電機工程學系 | zh_TW |
dc.contributor.department | Department of Electrical and Computer Engineering | en_US |
dc.identifier.wosnumber | WOS:000566404500049 | en_US |
dc.citation.woscount | 0 | en_US |
Appears in Collections: | Articles |
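
The abstract describes fusing RGB-D joint measurements with model-based "virtual measurements" when the camera view is occluded. The paper's actual estimator is not reproduced here; the following is a minimal, hypothetical Python sketch of that idea, assuming a constant-position Kalman filter per joint and a rigid upper-arm segment driven by the wearable's orientation. All names and parameters (`virtual_measurement`, `JointKalmanFilter`, noise variances, segment length) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def virtual_measurement(shoulder_pos, imu_rotation, upper_arm_length):
    """Hypothetical model-based 'virtual measurement' of the elbow:
    project the rigid upper-arm segment from the shoulder along the
    orientation (rotation matrix) reported by the wearable IMU."""
    limb_axis = np.array([0.0, 0.0, -upper_arm_length])  # segment direction in the sensor frame
    return shoulder_pos + imu_rotation @ limb_axis

class JointKalmanFilter:
    """Constant-position Kalman filter over a single 3-D joint location."""
    def __init__(self, initial_pos, process_var=1e-3, camera_var=1e-3, virtual_var=1e-2):
        self.x = np.asarray(initial_pos, dtype=float)  # state: joint position
        self.P = np.eye(3) * 0.1                       # state covariance
        self.Q = np.eye(3) * process_var               # process noise
        self.R_cam = np.eye(3) * camera_var            # RGB-D measurement noise
        self.R_virt = np.eye(3) * virtual_var          # model-based (virtual) measurement noise

    def step(self, camera_pos=None, virtual_pos=None):
        # Predict: the joint position is assumed roughly constant between frames.
        self.P = self.P + self.Q
        # Choose the measurement: camera when the joint is visible,
        # otherwise the virtual measurement derived from the limb model.
        if camera_pos is not None:
            z, R = np.asarray(camera_pos, float), self.R_cam
        elif virtual_pos is not None:
            z, R = np.asarray(virtual_pos, float), self.R_virt
        else:
            return self.x  # no measurement this frame: keep the prediction
        # Standard Kalman update with an identity observation model.
        S = self.P + R
        K = self.P @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(3) - K) @ self.P
        return self.x

# Usage: track an elbow; frames 1-2 simulate occlusion of the camera view.
kf = JointKalmanFilter(initial_pos=[0.2, 0.0, 1.0])
shoulder = np.array([0.0, 0.0, 1.3])
identity_orientation = np.eye(3)
for frame, cam in enumerate([[0.21, 0.01, 1.0], None, None, [0.2, 0.0, 0.99]]):
    virt = virtual_measurement(shoulder, identity_orientation, upper_arm_length=0.3)
    est = kf.step(camera_pos=cam, virtual_pos=virt if cam is None else None)
    print(f"frame {frame}: estimated elbow {est}")
```

The switch between camera and virtual measurements, with a larger noise term on the latter, is one simple way to realize the occlusion compensation the abstract outlines; the paper's framework additionally estimates sensor locations and the upper extremity model jointly, which this sketch does not attempt.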