Full metadata record
DC Field | Value | Language
dc.contributor.author | 宋永正 | en_US
dc.contributor.author | Yung-Cheng Sung | en_US
dc.contributor.author | 孫春在 | en_US
dc.contributor.author | Chuen-Tsai Sun | en_US
dc.date.accessioned | 2014-12-12T02:13:31Z | -
dc.date.available | 2014-12-12T02:13:31Z | -
dc.date.issued | 1994 | en_US
dc.identifier.uri | http://140.113.39.130/cdrfb3/record/nctu/#NT830394068 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/59093 | -
dc.description.abstract | 主動式視覺追蹤之研究近來益受重視,其重要性在很多先進的系統中已肯定,例如:安全系統、自動看護系統、無人駕駛系統、虛擬實境及遠距教學。在本論文中,我們嘗試提出一些在遠距教學環境中實際可行的追蹤方法,並應用模糊邏輯控制於攝影機的移動控制。基於一個快速的計算可使系統有快速的回應並降低系統的需求,我們提出了三個快速的追蹤演算法。第一個演算法是基於motion energy的計算,它的特色在於計算量非常的少並且對影像品質有較低的需求,例如物體的邊緣可以是非常的不清楚。為了提高追蹤的精確度,我們提出了第二個追蹤演算法,它是基於motion-edge的計算,能有效的抽取出物體的移動邊緣。為了能有效的克服背景物移動的問題,我們提出了第三個演算法,它利用了物體的紋理特徵來過濾背景物的移動,因此是一個optic-flow-texture-based的演算法。基本上,這些方法都有計算快速及不受限於追蹤物體類型的優點。這些演算法並已在遠距教學環境中經過實際測試。 Recently, active visual tracking has become more and more popular. Its importance has been pointed out in many advanced systems, e.g., distant education, virtual reality, automatic driving systems, and security systems. In this thesis, we present three new algorithms based on a distant education environment, and apply fuzzy logic control to camera motion control. Our first algorithm, a motion-energy-based algorithm, calculates a rough position of the tracked object with just a few operations, and it works well in our experiments. To track a moving object with better precision, we propose the second algorithm, which is a motion-edge-based algorithm. To improve the discriminating power between the tracked object and other moving objects, we propose an optic-flow-texture-based algorithm. Since the texture features can be computed in parallel, it is suitable for implementation on high-speed hardware. These proposed algorithms have been tested in a distant education environment. | zh_TW
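The motion-energy idea described in the abstract — estimating a rough position of the tracked object from inter-frame intensity change with only a few operations — can be illustrated with a simple frame-differencing sketch. The function name, the NumPy implementation, and the threshold value below are illustrative assumptions only; the record does not specify the thesis's actual algorithm or parameters.

```python
# A minimal frame-differencing sketch in the spirit of the motion-energy idea
# described in the abstract. Illustrative assumption, not the thesis's method.
import numpy as np

def motion_energy_centroid(prev_frame, curr_frame, threshold=25):
    """Estimate a rough object position from inter-frame intensity change.

    prev_frame, curr_frame: 2-D uint8 grayscale arrays of equal shape.
    threshold: minimum absolute intensity change counted as motion (assumed value).
    Returns (row, col) of the motion-energy centroid, or None if no motion is found.
    """
    # Absolute difference between consecutive frames, computed in int16
    # to avoid uint8 wrap-around.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    motion = diff > threshold            # binary motion-energy map
    rows, cols = np.nonzero(motion)
    if rows.size == 0:
        return None                      # no significant motion detected
    return rows.mean(), cols.mean()      # centroid serves as a rough position
```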
dc.language.iso | en_US | en_US
dc.subject | 視覺追蹤, 模糊邏輯控制, 遠距教學 | zh_TW
dc.subject | visual tracking, fuzzy logic control, distant education | en_US
dc.title | 主動式視覺追蹤系統之研究與製作 | zh_TW
dc.title | Active Visual Tracking | en_US
dc.type | Thesis | en_US
dc.contributor.department | 資訊科學與工程研究所 | zh_TW
Appears in Collections: Thesis