Title: Indoor Security Patrolling with Intruding Person Detection and Following Capabilities by Vision-Based Autonomous Vehicle Navigation
Authors: 王佑慈
蔡文祥
Institute of Computer Science and Engineering
Keywords: vehicle navigation; human detection; human tracking; camera calibration; security patrolling
Issue Date: 2005
Abstract: This study proposes a set of computer-vision techniques that give an autonomous vehicle navigating indoor environments the capability to detect and track people for security monitoring. A vehicle equipped with a robot arm serves as the experimental platform, and the vehicle is driven through the indoor environment by wireless control. An angle-based camera calibration method is proposed; from the angular information, the distance between a detected person and the vehicle can be computed. People in images are detected by the color and shape features of the face, with moving-object detection used for confirmation. After a person's face region is detected, the color features of the person's clothes are learned automatically, and the person is then tracked by detecting the clothes region. An escape strategy is also provided: when an intruder attempts to approach the vehicle, the vehicle backs away along its previously traveled trajectory until a safe distance from the intruder is restored. Finally, successful detection and tracking experiments demonstrate the completeness and feasibility of the proposed system.
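The escape strategy sketched in the abstract (backing away along the previously traveled trajectory until a safe distance from the intruder is kept) could be realized by keeping a stack of visited waypoints. The following is a minimal illustrative sketch; the 2-D waypoint representation and all names are assumptions for illustration, not the thesis's actual implementation:

```python
import math

def retreat_path(trajectory, intruder_pos, safe_dist):
    """Pop previously visited waypoints (most recent first) until the
    vehicle reaches a point at least safe_dist away from the intruder.

    trajectory   -- list of (x, y) waypoints, oldest first (assumed format)
    intruder_pos -- (x, y) position of the intruder
    safe_dist    -- required safety distance, same units as the waypoints
    """
    remaining = list(trajectory)  # copy so the caller's list is untouched
    path = []                     # waypoints to drive back through, in order
    while remaining:
        x, y = remaining.pop()    # retrace the trajectory backwards
        path.append((x, y))
        if math.hypot(x - intruder_pos[0], y - intruder_pos[1]) >= safe_dist:
            break                 # safe distance restored; stop retreating
    return path
```

For example, with trajectory `[(0, 0), (1, 0), (2, 0), (3, 0)]`, an intruder at `(4, 0)`, and a safety distance of 3, the vehicle would back up through `(3, 0)` and `(2, 0)` and stop at `(1, 0)`.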
A vision-based vehicle system for security patrolling by human detection and tracking in indoor environments is proposed. A vehicle with wireless control and a web camera is used as a test bed. A robot arm is mounted on the vehicle to hold the camera at a higher position and to change the orientation of the camera. First, a camera calibration method is proposed that uses an angular-mapping technique based on the concept of the spherical coordinate system. Next, a human detection module and a human tracking module are proposed, which use the color feature of the face and the rough shape of the human body to recognize human beings. To track a target person, a cloth region intersection method is proposed to predict the motion of the person. In addition, a vehicle escape function is proposed, designed to move the vehicle away from offensive strangers by a safe-distance-keeping technique. Good experimental results show the flexibility and feasibility of the proposed methods for the application of indoor security patrolling.
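The angular-mapping calibration described above lets the system compute the distance to a detected person from angular information alone. As a hedged sketch of the underlying geometry (not the thesis's actual formulation): if the camera height above the floor and the downward tilt angle to the person's feet are known, simple trigonometry yields the ground distance:

```python
import math

def ground_distance(camera_height_m, tilt_angle_deg):
    """Horizontal distance to a point on the floor, given the camera's
    height above the floor and the downward tilt angle (from horizontal)
    at which that floor point is seen.

    Both parameter names and the pinhole-style geometry are assumptions
    for illustration; the thesis derives distances via its own
    spherical-coordinate angular mapping.
    """
    theta = math.radians(tilt_angle_deg)
    # tan(theta) = height / distance  =>  distance = height / tan(theta)
    return camera_height_m / math.tan(theta)
```

For instance, a camera 1 m above the floor looking down at 45 degrees sees a floor point 1 m away; shallower tilt angles correspond to farther targets, which is why raising the camera on the robot arm extends the usable detection range.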
URI: http://140.113.39.130/cdrfb3/record/nctu/#GT009323520
http://hdl.handle.net/11536/79047
Appears in Collections:Thesis


Files in This Item:

  1. 352002.pdf
  2. 352003.pdf
  3. 352004.pdf
