Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 黃鑫茂 | en_US |
dc.contributor.author | Shin-Mao Huang | en_US |
dc.contributor.author | 陳永平 | en_US |
dc.contributor.author | Yon-Ping Chen | en_US |
dc.date.accessioned | 2014-12-12T02:24:11Z | - |
dc.date.available | 2014-12-12T02:24:11Z | - |
dc.date.issued | 1999 | en_US |
dc.identifier.uri | http://140.113.39.130/cdrfb3/record/nctu/#NT880591055 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/66288 | - |
dc.description.abstract | 本論文在研究一種新的演算法則,它利用基因演算法來訓練類神經網路的初始值,且其染色體為浮點數形式,這樣一來可節省計算時間。它不僅可改善倒傳遞演算法容易掉入區域最小解的缺點而且能夠克服基因法則無法有效收斂至鄰近區域最小解的困難。更進一步的研究單一基因交配比全部基因交配更快收斂至最佳解。最後,由模擬結果證實了此種演算法有較佳的收斂特性並且收斂時間也明顯的縮短。 | zh_TW |
dc.description.abstract | This thesis investigates a novel neural network training technique that employs a genetic algorithm to find the initial values of the neural network. Each candidate solution is represented by a chromosome whose parameters are encoded in floating-point form, which saves computation time and speeds convergence to the minimum. This hybrid algorithm overcomes not only back-propagation's tendency to become trapped in local minima but also the genetic algorithm's inability to converge efficiently to a nearby local minimum. Furthermore, the thesis shows that crossing over genes one at a time converges to the optimum faster than crossing over all genes at once. Finally, computer simulations confirm that the algorithm has better convergence properties and that the global search time is markedly reduced. (A minimal illustrative sketch of this two-stage scheme is given below, after the record.) Table of contents: 1.1 Motivation and Purpose; 1.2 Organization of the Thesis; Chapter 2 Introduction to Genetic Algorithm and Neural Network (2.1 History of GA and NN; 2.2 Genetic Algorithm Theory Review: 2.2.1 Basic Operations and Features, 2.2.2 Simple Genetic Algorithm; 2.3 Neural Network Theory Review: 2.3.1 Basic Operations and Features, 2.3.2 Back-propagation Algorithm); Chapter 3 Neural Network Training Technique by Using Genetic Algorithm (3.1 Specialized Operators and Concepts: 3.1.1 Expression of Population, 3.1.2 Rank-based Fitness, 3.1.3 Rank-based Reproduction, 3.1.4 Age and Lifetime, 3.1.5 Pocket Algorithm, 3.1.6 Floating Crossover and Mutation; 3.2 Neural Network Training Technique: 3.2.1 Weight, 3.2.2 Genetic Algorithm, 3.2.2.1 Parametric Crossover, 3.2.2.2 Parametric Mutation, 3.2.3 Connection, 3.2.4 Back-Propagation, 3.2.5 Architecture; 3.3 A Novel Genetic Algorithm: 3.3.1 Parametric Crossover, 3.3.2 Parametric Mutation); Chapter 4 Simulation Results and Applications (4.1 Simulation Results: 4.1.1 Problem Statement, 4.1.2 Implementation, 4.1.3 Simulation Results, 4.1.4 Analysis; 4.2 Application: 4.2.1 Problem Statement, 4.2.2 Implementation, 4.2.3 Simulation Result, 4.2.4 Analysis); Chapter 5 Conclusion; References. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | 類神經網路 | zh_TW |
dc.subject | 基因演算法 | zh_TW |
dc.subject | 倒傳遞演算法 | zh_TW |
dc.subject | 實數編碼 | zh_TW |
dc.subject | neural network | en_US |
dc.subject | genetic algorithm | en_US |
dc.subject | back-propagation (BP) | en_US |
dc.subject | real coding | en_US |
dc.title | 利用基因演算法訓練類神經網路的研究 | zh_TW |
dc.title | A Novel Neural Network Training Technique by Using Genetic Algorithm | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | 電控工程研究所 | zh_TW |
Appears in Collections: | Thesis |
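
The abstract above describes a two-stage training scheme: a real-coded (floating-point) genetic algorithm first searches for good initial network weights, using single-gene ("parametric") crossover and mutation, and back-propagation then fine-tunes the best chromosome. The following is a minimal sketch of that idea in Python/NumPy, not the thesis's own implementation: the XOR task, the 2-2-1 sigmoid network, the GA settings, and the exact crossover and mutation operators are assumptions made here for illustration only.

```python
# Illustrative sketch (assumptions throughout): a real-coded GA searches for
# initial weights of a small feed-forward network, then back-propagation
# fine-tunes the best chromosome, as the abstract describes.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-2-1 sigmoid network (an assumption, not from the thesis).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 2, 1
N_GENES = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # all weights and biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(chrom):
    """Split a flat floating-point chromosome into weight matrices and biases."""
    i = 0
    W1 = chrom[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = chrom[i:i + N_HID];                              i += N_HID
    W2 = chrom[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = chrom[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(chrom):
    W1, b1, W2, b2 = unpack(chrom)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

def parametric_crossover(p1, p2):
    """Single-gene crossover: blend one randomly chosen parameter only,
    rather than recombining the whole chromosome at once."""
    child = p1.copy()
    g = rng.integers(N_GENES)
    alpha = rng.random()
    child[g] = alpha * p1[g] + (1 - alpha) * p2[g]
    return child

def parametric_mutation(chrom, sigma=0.5, rate=0.1):
    """Perturb a few genes with Gaussian noise (floating-point mutation)."""
    mask = rng.random(N_GENES) < rate
    return chrom + mask * rng.normal(0.0, sigma, N_GENES)

# --- Stage 1: real-coded GA searches for promising initial weights. ---
POP, GENERATIONS = 30, 200
pop = rng.normal(0.0, 1.0, (POP, N_GENES))
for _ in range(GENERATIONS):
    pop = pop[np.argsort([mse(c) for c in pop])]   # rank by error, best first
    elite = pop[: POP // 2]                        # keep the better half
    children = [parametric_mutation(
                    parametric_crossover(*elite[rng.integers(len(elite), size=2)]))
                for _ in range(POP - len(elite))]
    pop = np.vstack([elite, children])

best = pop[np.argmin([mse(c) for c in pop])]
print("error after GA search:", mse(best))

# --- Stage 2: back-propagation fine-tunes the GA result. ---
W1, b1, W2, b2 = (a.copy() for a in unpack(best))
lr = 1.0
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)       # gradient of squared error + sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print("error after back-prop:", float(np.mean((out - y) ** 2)))
```

Running the sketch prints the training error after the GA stage and again after back-propagation; the second value should typically be much smaller, which mirrors the behaviour the abstract attributes to the hybrid scheme: the GA provides a good starting point in weight space, and gradient descent handles the fine convergence the GA alone does poorly.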