Full metadata record
DC Field: Value (Language)
dc.contributor.author: Huang, CD (en_US)
dc.contributor.author: Chung, IF (en_US)
dc.contributor.author: Pal, NR (en_US)
dc.contributor.author: Lin, CT (en_US)
dc.date.accessioned: 2014-12-08T15:41:35Z
dc.date.available: 2014-12-08T15:41:35Z
dc.date.issued: 2003 (en_US)
dc.identifier.isbn: 3-540-40408-2 (en_US)
dc.identifier.issn: 0302-9743 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/28276
dc.description.abstract: The success of a classification system depends heavily on two things: the tools being used and the features considered. In bioinformatics applications, the role of appropriate features has not received adequate attention. In this investigation we use two novel ideas. First, we use neural networks in which each input node is associated with a gate. At the beginning of training all gates are almost closed, i.e., no feature is allowed to enter the network. During training, depending on the requirements, gates are either opened or closed. At the end of training, gates corresponding to good features are completely open, while gates corresponding to bad features are closed more tightly; some gates may be partially open. The network can therefore not only select features online as learning proceeds, it also performs some feature extraction. The second novel idea is a hierarchical machine learning architecture: at the first level the network classifies the data into four major folds (all alpha, all beta, alpha + beta, and alpha/beta), and at the next level another set of networks further classifies the data into twenty-seven folds. This approach achieves the following. The gating network reduces the number of features drastically: for the first level, using just 50 features selected by the gating network, we obtain test accuracy comparable to that of neural classifiers using 125 features. The process also gives better insight into the folding process; for example, by tracking the evolution of the different gates we can find which characteristics (features) of the data are more important for folding. It also reduces computation time. The hierarchical architecture further improves performance. (en_US)
dc.language.iso: en_US (en_US)
dc.title: Machine learning for multi-class protein fold classification based on neural networks with feature gating (en_US)
dc.type: Article; Proceedings Paper (en_US)
dc.identifier.journal: ARTIFICIAL NEURAL NETWORKS AND NEURAL INFORMATION PROCESSING - ICANN/ICONIP 2003 (en_US)
dc.citation.volume: 2714 (en_US)
dc.citation.spage: 1168 (en_US)
dc.citation.epage: 1175 (en_US)
dc.contributor.department: 電控工程研究所 (zh_TW)
dc.contributor.department: Institute of Electrical and Control Engineering (en_US)
dc.identifier.wosnumber: WOS:000185378100139
Appears in Collections: Conference Papers
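The feature-gating idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the sigmoid gate function, the toy data, the linear classifier, and all hyperparameters below are assumptions chosen for illustration. Each input is multiplied by a gate opening in (0, 1); gates start almost closed, and gradient descent opens the gates of informative features while leaving gates of irrelevant features closed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 features, but only the first two determine the label.
X = rng.uniform(-1.0, 1.0, size=(400, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters: weights, bias, and one trainable gate per input feature.
w = np.zeros(4)
b = 0.0
g = np.full(4, -2.0)        # gates start almost closed: sigmoid(-2) ~ 0.12

lr = 0.5
for _ in range(500):
    a = sigmoid(g)          # gate openings in (0, 1)
    z = (X * a) @ w + b     # each input is attenuated by its gate
    p = sigmoid(z)
    err = p - y             # gradient of cross-entropy loss w.r.t. z
    grad_w = (X * a).T @ err / len(y)
    grad_b = err.mean()
    grad_g = (X * w).T @ err / len(y) * a * (1.0 - a)
    w -= lr * grad_w
    b -= lr * grad_b
    g -= lr * grad_g

openings = sigmoid(g)
print(openings.round(3))    # informative gates open, noise gates stay shut
```

Because the label ignores features 2 and 3, their weights stay near zero and their gate gradients vanish, so those gates remain nearly closed; the gates on features 0 and 1 receive a consistent opening signal. Ranking the final gate openings yields an online feature selection, in the spirit of the gated network in the abstract.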