Title: | Vision-Based Design for Adaptive Grasping of a Humanoid Robot Arm |
Authors: | Tsai, Shih-Cheng; Song, Kai-Tai; Institute of Electrical and Control Engineering |
Keywords: | manipulator; visual servoing; safety; potential field |
Issue Date: | 2011 |
Abstract: | This thesis presents the design of a safe grasping control system for a humanoid robot arm. Using a Kinect camera mounted on the robot's head, the robot detects the spatial distribution of objects in the environment in real time, recognizes and locates the target object, and autonomously guides the arm to a suitable grasping position. A method for environment description and feature matching is developed: using the Kinect depth image, each object is segmented into several planes, which simplifies the description of a cluttered environment and shortens detection time. The corresponding planes are then found in the color image, and speeded-up robust features (SURF) extracted from each plane are matched against a pre-built database. By narrowing the matching region to these planes, matching time is greatly reduced compared with matching over the whole image. In addition, a safe-behavior control strategy for the arm is proposed. Because the arm must remain safe while grasping in a cluttered home environment, two safety indices are designed: a cylinder safety index, which quantifies the influence of each obstacle on the arm's motion, and a region safety index, which partitions the workspace into safe, uncertain, and dangerous regions so that the arm moves through the safer parts of the space. Finally, a potential-field path-planning algorithm is developed in which these safety indices are incorporated into the repulsive and attractive forces, enabling the arm to avoid obstacles and complete the grasping task. Experimental results verify that the proposed methods achieve the intended functions. |
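The safety indices and potential-field planner described in the abstract can be sketched as follows. This is a minimal illustration under assumed models: cylindrical obstacles, made-up thresholds and gains, and hypothetical function names (`region_safety`, `cylinder_safety`, `potential_step`) that are not taken from the thesis.

```python
import math

# Illustrative sketch only: all names, thresholds, and gains below are
# assumptions for demonstration, not the author's actual parameters.

def region_safety(dist, safe_t=0.30, danger_t=0.10):
    """Region safety index: classify a point by its distance (m) to the
    nearest obstacle as safe / uncertain / danger."""
    if dist >= safe_t:
        return "safe"
    return "uncertain" if dist >= danger_t else "danger"

def _closest_on_axis(p, a, b):
    """Closest point to p on the segment a-b (an obstacle's axis)."""
    ab = [bi - ai for ai, bi in zip(a, b)]
    ap = [pi - ai for ai, pi in zip(a, p)]
    t = sum(x * y for x, y in zip(ap, ab)) / sum(x * x for x in ab)
    t = min(1.0, max(0.0, t))
    return [ai + t * x for ai, x in zip(a, ab)]

def cylinder_safety(p, a, b, radius):
    """Cylinder safety index: influence of a cylindrical obstacle (axis
    a-b, given radius) on point p; 1 at the surface, decaying to 0 once
    p is more than one radius away from the surface."""
    d = math.dist(p, _closest_on_axis(p, a, b))
    return max(0.0, 1.0 - max(d - radius, 0.0) / radius)

def potential_step(p, goal, obstacles, k_att=1.0, k_rep=0.05, step=0.05):
    """One potential-field step: attraction toward the goal plus
    safety-weighted repulsion away from each cylindrical obstacle."""
    force = [k_att * (g - pi) for g, pi in zip(goal, p)]
    for a, b, r in obstacles:
        s = cylinder_safety(p, a, b, r)
        if s > 0.0:
            c = _closest_on_axis(p, a, b)
            away = [pi - ci for pi, ci in zip(p, c)]
            n = math.dist(p, c)
            if n > 1e-9:
                force = [f + k_rep * s * x / n for f, x in zip(force, away)]
    norm = math.sqrt(sum(f * f for f in force))
    return [pi + step * f / norm for pi, f in zip(p, force)]
```

Iterating `potential_step` moves the end-effector toward the grasp target while the safety-weighted repulsive term bends the path around obstacles; the region index can additionally be used to bias waypoint selection toward "safe" cells, as the abstract describes.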
URI: | http://140.113.39.130/cdrfb3/record/nctu/#GT079812541 http://hdl.handle.net/11536/46898 |
Appears in Collections: | Thesis |