Title: | New results on fuzzy perceptron learning algorithm |
Authors: | Chang, Horng-Ying; Jyh-Yeong Chang; Institute of Electrical and Control Engineering |
Keywords: | fuzzy theory; perceptron; vertex method; fuzzy pocket; fuzzy if-then rule; level-sets |
Date of Issue: | 1995 |
Abstract: | This thesis modifies a learning algorithm for fuzzy perceptron neural networks, used in classifiers that utilize expert knowledge represented by fuzzy if-then rules as well as numerical data. We extend the conventional linear perceptron network to a second-order one, which provides much more flexibility in the discriminant function. In order to handle linguistic variables in neural networks, level sets from fuzzy set theory are incorporated into perceptron learning. At different levels of the input fuzzy number, the fuzzy perceptron algorithm is derived from the fuzzy output function and the corresponding nonfuzzy target output that indicates the correct class of the fuzzy input vector. The vertex method is borrowed and modified to obtain the extreme points of the fuzzy output function, which greatly reduces the computational complexity and hence the time required by the perceptron learning algorithm. Moreover, the pocket algorithm is adapted to our fuzzy perceptron learning scheme, called the fuzzy pocket algorithm, to solve the nonseparability problem, such as overlapping fuzzy inputs. Intensive computer simulations demonstrate the effectiveness of the modified algorithm, which solves the inaccuracy and speed problems encountered in the fuzzy BP algorithm of Tanaka et al. |
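The vertex method cited in the abstract rests on a simple idea: if a function is monotone in each variable over an interval box (such as an α-level cut of fuzzy inputs), its extreme values over that box occur at the box's corner points. A minimal sketch, not the thesis's modified version, with illustrative names:

```python
from itertools import product

def vertex_method(f, intervals):
    """Vertex method sketch: for f monotone in each variable over the
    given box, its min and max over the box are attained at corners,
    so evaluate f at all 2^n vertices and take the extremes."""
    corners = product(*intervals)              # all 2^n corner points
    values = [f(*corner) for corner in corners]
    return min(values), max(values)
```

For example, `vertex_method(lambda x, y: x + 2*y, [(0.0, 1.0), (1.0, 2.0)])` returns the output interval `(2.0, 5.0)`. Evaluating only 2^n corners, rather than searching the whole box, is what the abstract credits with reducing the computational cost of learning at each α-level.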
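The fuzzy pocket algorithm builds on Gallant's pocket algorithm, which keeps the best weight vector found so far so that perceptron learning still yields a usable separator when classes are not separable. A crisp-input sketch of the underlying pocket idea (cyclic presentation instead of the classic random sampling, with illustrative names; the thesis's fuzzy version operates on level sets of fuzzy inputs):

```python
def pocket_perceptron(X, y, epochs=100, lr=1.0):
    """Pocket algorithm sketch: run perceptron corrections, but keep a
    'pocket' copy of the best weights seen so far (by training accuracy),
    so a good separator survives even if the data are nonseparable."""
    n = len(X[0])
    w = [0.0] * (n + 1)                        # last entry is the bias

    def predict(weights, x):
        s = weights[-1] + sum(wi * xi for wi, xi in zip(weights, x))
        return 1 if s >= 0 else -1             # convention: s == 0 -> +1

    def accuracy(weights):
        return sum(predict(weights, x) == t for x, t in zip(X, y)) / len(X)

    pocket, best = list(w), accuracy(w)
    for _ in range(epochs):
        for x, t in zip(X, y):
            if predict(w, x) != t:             # standard perceptron correction
                for j in range(n):
                    w[j] += lr * t * x[j]
                w[-1] += lr * t
                acc = accuracy(w)
                if acc > best:                 # stash better weights in the pocket
                    pocket, best = list(w), acc
    return pocket, best
```

On separable data (e.g. logical AND with targets in {-1, +1}) the pocket converges to a perfect separator; on overlapping data it returns the best separator encountered, which is the property the thesis carries over to overlapping fuzzy inputs.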
URI: | http://140.113.39.130/cdrfb3/record/nctu/#NT840327042 http://hdl.handle.net/11536/60299 |
Appears in Collections: | Thesis |