Title: Knowledge Discovery from Shape Contours by Using Inductive Learning
Authors: Hsu, Jui-Chi
Hwang, Shu-Yuen
Institute of Computer Science and Engineering
Keywords: machine learning; inductive logic programming; Hotelling transform; k-curvature algorithm
Issue Date: 1994
Abstract: Computer vision systems often use shape information to construct the geometric structure of an object and then recognize the object with domain knowledge. Such domain knowledge, however, is hard to obtain; it is usually acquired through interaction with experts. To reduce this reliance on experts, many applications use machine learning to acquire the domain knowledge, but the knowledge learned this way is typically encoded numerically rather than in a symbolic form that humans can easily understand. We address this deficiency with a newer machine learning technique, inductive logic programming (ILP). Our method first smooths each shape contour, determines its principal axes with the Hotelling transform, and segments the processed contour with a modified k-curvature algorithm. For each segment we derive vectors describing its own properties and its interrelations with other segments, and we translate these vectors into symbolic representations. Finally, FOIL, an ILP system, reads the resulting tuples and induces rules. These rules reflect the characteristics of the shape contours and their meanings are intuitive to us, so the method can serve as an aid to knowledge discovery. Experimental results show that our approach meets our initial expectations.
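The abstract outlines a geometric preprocessing pipeline: contour smoothing, principal axes via the Hotelling transform, and k-curvature segmentation. Below is a minimal Python/NumPy sketch of these standard building blocks. The function names, the moving-average smoother, and the corner threshold are illustrative assumptions; the thesis's modified k-curvature algorithm and its exact feature encoding are not reproduced here.

import numpy as np

def smooth_contour(points, window=5):
    # Circular moving average over a closed contour (assumed smoother;
    # the thesis does not specify its exact smoothing method).
    half = window // 2
    padded = np.concatenate([points[-half:], points, points[:half]])
    kernel = np.ones(window) / window
    return np.column_stack([
        np.convolve(padded[:, d], kernel, mode='valid') for d in range(2)
    ])

def hotelling_transform(points):
    # Principal axes via the Hotelling (PCA) transform: eigenvectors of the
    # covariance matrix of the centred contour points.
    mean = points.mean(axis=0)
    centred = points - mean
    cov = np.cov(centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    axes = eigvecs[:, np.argsort(eigvals)[::-1]]  # major axis first
    return centred @ axes, axes, mean             # contour in its principal frame

def k_curvature(points, k=5):
    # Angle at each contour point between the chords to its k-th neighbours;
    # sharp corners give small included angles.
    n = len(points)
    angles = np.empty(n)
    for i in range(n):
        a = points[(i - k) % n] - points[i]
        b = points[(i + k) % n] - points[i]
        c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        angles[i] = np.arccos(np.clip(c, -1.0, 1.0))
    return angles

def segment_breakpoints(angles, threshold=2.0):
    # Hypothetical segmentation rule: points whose k-curvature angle falls
    # below the threshold (radians) are treated as breakpoints between segments.
    return np.flatnonzero(angles < threshold)

# Example usage on a noisy elliptical contour.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
contour = np.column_stack([3 * np.cos(t), np.sin(t)]) + 0.01 * np.random.randn(200, 2)
aligned, axes, mean = hotelling_transform(smooth_contour(contour))
corners = segment_breakpoints(k_curvature(aligned, k=7))

The later steps, converting segment measurements into symbolic predicates and formatting them as input tuples for FOIL, depend on the thesis's own encoding and are therefore not sketched here.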
URI: http://140.113.39.130/cdrfb3/record/nctu/#NT830392062
http://hdl.handle.net/11536/58986
Appears in Collections: Thesis