Full metadata record
DC Field | Value | Language
dc.contributor.author | Tsai, Chi-Yi | en_US
dc.contributor.author | Song, Kai-Tai | en_US
dc.contributor.author | Dutoit, Xavier | en_US
dc.contributor.author | Van Brussel, Hendrik | en_US
dc.contributor.author | Nuttin, Marnix | en_US
dc.date.accessioned | 2014-12-08T15:16:18Z | -
dc.date.available | 2014-12-08T15:16:18Z | -
dc.date.issued | 2007 | en_US
dc.identifier.isbn | 978-1-4244-0789-7 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/12089 | -
dc.description.abstract | This paper presents a novel design of a robust visual tracking control system consisting of a visual tracking controller and a visual state estimator. The system facilitates human-robot interaction for a unicycle-modeled mobile robot equipped with a tilt camera. Based on a novel dual-Jacobian visual interaction model, a dynamically moving target can be tracked by a single visual tracking controller without knowledge of the target's 3D velocity. The visual state estimator estimates the optimal system state and the target's image velocity, which are then used by the visual tracking controller. To this end, a self-tuning Kalman filter is proposed to estimate the parameters of interest online in real time. Furthermore, because the proposed method operates entirely in image space, both the computational complexity and the sensor/camera modeling errors are reduced. Experimental results validate the effectiveness of the proposed method in terms of tracking performance, system convergence, and robustness. | en_US
dc.language.iso | en_US | en_US
dc.subject | system modelling | en_US
dc.subject | visual tracking control | en_US
dc.subject | visual estimation | en_US
dc.subject | self-tuning Kalman filter | en_US
dc.title | Robust mobile robot visual tracking control system using self-tuning Kalman filter | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2007 International Symposium on Computational Intelligence in Robotics and Automation | en_US
dc.citation.spage | 137 | en_US
dc.citation.epage | 142 | en_US
dc.contributor.department | 電控工程研究所 (Institute of Electrical and Control Engineering) | zh_TW
dc.contributor.department | Institute of Electrical and Control Engineering | en_US
dc.identifier.wosnumber | WOS:000249266100024 | -
Appears in Collections: Conferences Paper
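The abstract above describes a self-tuning Kalman filter that estimates the target's image position and velocity online, in real time, entirely in image space. As an illustration only, the following is a minimal Python sketch of one common self-tuning scheme: innovation-based adaptation of the measurement-noise covariance R on top of a constant-velocity model in image coordinates. The class name `SelfTuningKF`, the constant-velocity model, and the sliding adaptation window are assumptions for this sketch; the paper's actual dual-Jacobian formulation and tuning law may differ.

```python
import numpy as np

class SelfTuningKF:
    """Constant-velocity Kalman filter over image coordinates with an
    innovation-based online update of the measurement-noise covariance R.
    A minimal sketch, not the paper's exact self-tuning formulation."""

    def __init__(self, dt=1.0 / 30, q=1.0, r0=4.0, window=30):
        # State: [u, v, du, dv] (pixels, pixels/s); F is the CV transition model.
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        # Measurement: pixel position [u, v] only.
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)        # process noise (assumed)
        self.R = r0 * np.eye(2)       # initial measurement noise, tuned online
        self.x = np.zeros(4)
        self.P = 100.0 * np.eye(4)    # large initial uncertainty
        self.innovations = []         # sliding window for tuning R
        self.window = window

    def step(self, z):
        # Predict.
        x_pred = self.F @ self.x
        P_pred = self.F @ self.P @ self.F.T + self.Q
        # Innovation and its predicted covariance.
        y = np.asarray(z, dtype=float) - self.H @ x_pred
        S = self.H @ P_pred @ self.H.T + self.R
        # Update.
        K = P_pred @ self.H.T @ np.linalg.inv(S)
        self.x = x_pred + K @ y
        self.P = (np.eye(4) - K @ self.H) @ P_pred
        # Self-tuning step: match R to the sample innovation covariance,
        # using E[y y^T] = H P_pred H^T + R (innovation-based adaptation).
        self.innovations.append(np.outer(y, y))
        if len(self.innovations) > self.window:
            self.innovations.pop(0)
        if len(self.innovations) == self.window:
            C = np.mean(self.innovations, axis=0)
            R_new = C - self.H @ P_pred @ self.H.T
            # Keep R symmetric positive definite by flooring its eigenvalues.
            w, V = np.linalg.eigh(R_new)
            self.R = V @ np.diag(np.clip(w, 1e-3, None)) @ V.T
        return self.x  # estimated image position and velocity
```

Given a stream of pixel measurements from the image tracker, `kf.step(z)[2:]` would yield the estimated target image velocity that a visual tracking controller of the kind described in the abstract could consume.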