Title: Divide-and-conquer learning and modular perceptron networks
Authors: Fu, HC
Lee, YP
Chiang, CC
Pao, HT
Department of Computer Science
Department of Management Science
Keywords: divide-and-conquer learning; modular perceptron network; multilayer perceptron; weight estimation
Issue Date: 1-Mar-2001
Abstract: A novel modular perceptron network (MPN) and divide-and-conquer learning (DCL) schemes for the design of modular neural networks are proposed. When a training process in a multilayer perceptron falls into a local minimum or stalls in a flat region, the proposed DCL scheme is applied to divide the current training data region (e.g., a training set that is hard to learn) into two regions that are (hopefully) easier to learn. The learning process continues when a self-growing perceptron network and its initial weight estimation are constructed for one of the newly partitioned regions. The other partitioned region resumes the training process on the original perceptron network. Data region partitioning, weight estimating, and learning are iteratively repeated until all the training data are completely learned by the MPN. We have evaluated and compared the proposed MPN with several representative neural networks on the two-spirals problem and real-world datasets. The MPN achieves better weight learning performance by requiring far fewer data presentations (99.01%~87.86% fewer) during the network training phases, better generalization performance (4.0% better), and less processing time (2.0%~81.3% less) during the retrieving phase. On learning the real-world data, the MPNs show less overfitting compared to a single MLP. In addition, due to its self-growing and fast local learning characteristics, the modular network (MPN) can easily adapt to online and/or incremental learning requirements in a rapidly changing environment.
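The abstract's core idea — split a training region in two whenever a perceptron gets stuck, give each sub-region its own module, and route inputs to the right module at retrieval time — can be sketched in a highly simplified toy form. The sketch below is NOT the paper's MPN/DCL algorithm (the paper uses weight estimation and self-growing perceptron networks; here the "stuck" test is just imperfect training accuracy and the split is a median cut on one feature). All names (`perceptron_train`, `dcl_train`, `dcl_predict`) are hypothetical illustrations:

```python
def perceptron_train(data, epochs=50):
    """Train one linear perceptron (toy stand-in for an MPN module).

    data: list of (x1, x2, y) with y in {0, 1}.
    Returns (weights, training accuracy).
    """
    w = [0.0, 0.0, 0.0]  # weights for x1, x2, and a bias term
    for _ in range(epochs):
        for x1, x2, y in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0
            err = y - pred
            w[0] += 0.1 * err * x1
            w[1] += 0.1 * err * x2
            w[2] += 0.1 * err
    correct = sum(1 for x1, x2, y in data
                  if (1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0) == y)
    return w, correct / len(data)

def dcl_train(data, depth=0, max_depth=4):
    """Divide-and-conquer: if a region is not fully learned, split it in two.

    Here the split is a crude median cut on the first feature; the paper
    instead partitions the hard-to-learn region and estimates initial
    weights for a new self-growing module.
    """
    w, acc = perceptron_train(data)
    if acc == 1.0 or depth == max_depth or len(data) < 2:
        return ("leaf", w)  # region learned (or give up): keep this module
    xs = sorted(p[0] for p in data)
    thresh = xs[len(xs) // 2]
    left = [p for p in data if p[0] < thresh]
    right = [p for p in data if p[0] >= thresh]
    if not left or not right:
        return ("leaf", w)
    # One sub-region gets a fresh module; the other continues training.
    return ("split", thresh,
            dcl_train(left, depth + 1, max_depth),
            dcl_train(right, depth + 1, max_depth))

def dcl_predict(tree, x1, x2):
    """Route the input to the module responsible for its region."""
    if tree[0] == "leaf":
        w = tree[1]
        return 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0
    _, thresh, left, right = tree
    return dcl_predict(left if x1 < thresh else right, x1, x2)
```

On XOR-style data, which a single perceptron cannot learn, the splitting step yields two linearly separable sub-regions, each handled by its own module — the same motivation the abstract gives for partitioning a hard-to-learn training set.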
URI: http://dx.doi.org/10.1109/72.914522
http://hdl.handle.net/11536/29812
ISSN: 1045-9227
DOI: 10.1109/72.914522
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume: 12
Issue: 2
Start Page: 250
End Page: 263
Appears in Collections: Articles


Files in This Item:

  1. 000167886700006.pdf
