Full metadata record
DC Field: Value (Language)
dc.contributor.author: Wang, YJ (en_US)
dc.contributor.author: Lin, CT (en_US)
dc.date.accessioned: 2019-04-02T05:58:57Z (-)
dc.date.available: 2019-04-02T05:58:57Z (-)
dc.date.issued: 1998-12-01 (en_US)
dc.identifier.issn: 0893-6080 (en_US)
dc.identifier.uri: http://dx.doi.org/10.1016/S0893-6080(98)00091-4 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/148046 (-)
dc.description.abstract: This article proposes a new second-order learning algorithm for training the multilayer perceptron (MLP) networks. The proposed algorithm is a revised Newton's method. A forward-backward propagation scheme is first proposed for network computation of the Hessian matrix, H, of the output error function of the MLP. A block Hessian matrix, H-b, is then defined to approximate and simplify H. Several lemmas and theorems are proved to uncover the important properties of H and H-b, and verify the good approximation of H-b to H; H-b preserves the major properties of H. The theoretic analysis leads to the development of an efficient way for computing the inverse of H-b recursively. In the proposed second-order learning algorithm, the least squares estimation technique is adopted to further lessen the local minimum problems. The proposed algorithm overcomes not only the drawbacks of the standard backpropagation algorithm (i.e. slow asymptotic convergence rate, bad controllability of convergence accuracy, local minimum problems, and high sensitivity to learning constant), but also the shortcomings of normal Newton's method used on the MLP, such as the lack of network implementation of H, ill representability of the diagonal terms of H, the heavy computation load of the inverse of H, and the requirement of a good initial estimate of the solution (weights). Several example problems are used to demonstrate the efficiency of the proposed learning algorithm. Extensive performance (convergence rate and accuracy) comparisons of the proposed algorithm with other learning schemes (including the standard backpropagation algorithm) are also made. (C) 1998 Elsevier Science Ltd. All rights reserved. (en_US)
dc.language.iso: en_US (en_US)
dc.subject: multilayer perceptrons (en_US)
dc.subject: Hessian matrix (en_US)
dc.subject: forward-backward propagation (en_US)
dc.subject: Newton's method (en_US)
dc.subject: least squares estimation (en_US)
dc.title: A second-order learning algorithm for multilayer networks based on block Hessian matrix (en_US)
dc.type: Article (en_US)
dc.identifier.doi: 10.1016/S0893-6080(98)00091-4 (en_US)
dc.identifier.journal: NEURAL NETWORKS (en_US)
dc.citation.volume: 11 (en_US)
dc.citation.spage: 1607 (en_US)
dc.citation.epage: 1622 (en_US)
dc.contributor.department: 電控工程研究所 (zh_TW)
dc.contributor.department: Institute of Electrical and Control Engineering (en_US)
dc.identifier.wosnumber: WOS:000077631700004 (en_US)
dc.citation.woscount: 14 (en_US)
Appears in Collections: Journal Papers
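The abstract above describes a Newton-type weight update in which a simplified block Hessian is inverted in place of the full Hessian H. As a loose illustration of that general idea only, the following is a minimal sketch of a block-diagonal (Gauss-Newton) update on a toy least-squares problem; the problem, the function names, and the blocking scheme are all invented here, and this is not the paper's H-b construction, its forward-backward propagation scheme, or its recursive inverse.

```python
import numpy as np

# Generic sketch: a Newton-style step that inverts only the diagonal
# blocks of a Gauss-Newton Hessian approximation. Illustrative only;
# NOT the paper's H-b algorithm. All names here are hypothetical.

rng = np.random.default_rng(0)

# Toy linear least-squares problem standing in for the network's
# output error function: fit y = X @ w_true.
X = rng.normal(size=(20, 2))
w_true = np.array([1.5, -0.5])
y = X @ w_true

def loss(w):
    """Squared output error 0.5 * ||X w - y||^2."""
    r = X @ w - y
    return 0.5 * float(r @ r)

def block_newton_step(w, blocks=((0,), (1,)), damping=1e-3):
    """One update using only the diagonal blocks of H = X^T X."""
    g = X.T @ (X @ w - y)              # gradient of the error function
    H = X.T @ X                        # Gauss-Newton Hessian of the loss
    w_new = w.copy()
    for idx in blocks:                 # solve each diagonal block on its
        idx = list(idx)                # own, ignoring off-block curvature
        H_block = H[np.ix_(idx, idx)] + damping * np.eye(len(idx))
        w_new[idx] -= np.linalg.solve(H_block, g[idx])
    return w_new

w = np.zeros(2)
initial = loss(w)
for _ in range(10):                    # off-block terms are ignored, so
    w = block_newton_step(w)           # several steps are needed
print(loss(w) < initial)               # prints True: the error shrinks
```

Keeping only the diagonal blocks trades some curvature information for a much cheaper inverse: each small block is solved independently instead of factoring the full matrix, which is the computational motivation behind block-Hessian schemes in general.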