Full metadata record
DC Field | Value | Language
dc.contributor.author | Tsai, Chi-Yi | en_US
dc.contributor.author | Song, Kai-Tai | en_US
dc.date.accessioned | 2014-12-08T15:25:36Z | -
dc.date.available | 2014-12-08T15:25:36Z | -
dc.date.issued | 2005 | en_US
dc.identifier.isbn | 0-7803-9044-X | en_US
dc.identifier.uri | http://hdl.handle.net/11536/18008 | -
dc.description.abstract | This paper presents a novel visual tracking control scheme that is robust to the quantization error encountered in practical implementations. The proposed scheme is based on an error model of camera-object visual interaction in the image plane for unicycle mobile robots. To overcome the quantization error, a necessary condition for global asymptotic stability of the closed-loop visual tracking system is derived through Lyapunov's direct method. A robust control law based on Lyapunov theory is then proposed to guarantee that the visual tracking system satisfies this necessary stability condition. Experimental results verify the effectiveness of the proposed control scheme in terms of both tracking performance and system convergence. | en_US
dc.language.iso | en_US | en_US
dc.subject | visual tracking control | en_US
dc.subject | visual servoing | en_US
dc.subject | mobile robots | en_US
dc.subject | human-robot interaction | en_US
dc.title | Robust visual tracking control of mobile robots based on an error model in image plane | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2005 IEEE International Conference on Mechatronics and Automations, Vols 1-4, Conference Proceedings | en_US
dc.citation.spage | 1218 | en_US
dc.citation.epage | 1223 | en_US
dc.contributor.department | 電控工程研究所 | zh_TW
dc.contributor.department | Institute of Electrical and Control Engineering | en_US
dc.identifier.wosnumber | WOS:000238860802017 | -
Appears in Collections: Conferences Paper
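
The abstract above describes an image-plane error model for a unicycle mobile robot whose camera measurements suffer from pixel quantization, with a Lyapunov-based robust control law. The following is a minimal illustrative sketch of that kind of tracking loop, assuming a simple pinhole camera, a proportional bearing/range law, and hypothetical gains K_V and K_W; it is not the authors' error model or their Lyapunov-based robust control law.

```python
import math

# Hypothetical parameters for the sketch (not taken from the paper).
FOCAL_PX = 500.0       # camera focal length in pixels
DESIRED_RANGE = 2.0    # desired robot-target distance [m]
K_V, K_W = 0.8, 2.0    # assumed linear / angular gains
DT = 0.05              # control period [s]

def quantize(u: float) -> float:
    """Round an image coordinate to the nearest pixel (the quantization error source)."""
    return float(round(u))

def measure(robot, target):
    """Project the target into the robot's camera and return the quantized
    horizontal pixel coordinate together with the forward range."""
    x, y, theta = robot
    dx, dy = target[0] - x, target[1] - y
    xc = math.cos(theta) * dx + math.sin(theta) * dy    # forward distance in camera frame
    yc = -math.sin(theta) * dx + math.cos(theta) * dy   # lateral offset in camera frame
    u = quantize(FOCAL_PX * yc / max(xc, 1e-6))         # pixel column, quantized
    return u, xc

def control(u, rng):
    """Toy proportional image-based law: drive the pixel error to zero and hold
    a desired range. A stand-in for the paper's Lyapunov-based robust law."""
    w = K_W * (u / FOCAL_PX)          # turn toward the target's bearing
    v = K_V * (rng - DESIRED_RANGE)   # approach or retreat to the desired range
    return v, w

def step(robot, v, w):
    """Unicycle kinematics integrated with forward Euler."""
    x, y, theta = robot
    return (x + v * math.cos(theta) * DT,
            y + v * math.sin(theta) * DT,
            theta + w * DT)

if __name__ == "__main__":
    robot = (0.0, 0.0, 0.0)   # x [m], y [m], heading [rad]
    target = (4.0, 1.5)       # static target, for simplicity
    for _ in range(400):
        u, rng = measure(robot, target)
        v, w = control(u, rng)
        robot = step(robot, v, w)
    print("final pose:", tuple(round(c, 3) for c in robot))
```

In this toy loop the quantization enters only through the rounding of the pixel coordinate; the paper's contribution is a control law shown, via a Lyapunov-derived necessary condition, to keep the closed loop globally asymptotically stable despite that error, which the simple proportional sketch above does not attempt to establish.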