Full metadata record
DC Field: Value (Language)
dc.contributor.author: Hsu, CN (en_US)
dc.contributor.author: Huang, HJ (en_US)
dc.contributor.author: Schuschel, D (en_US)
dc.date.accessioned: 2014-12-08T15:42:34Z
dc.date.available: 2014-12-08T15:42:34Z
dc.date.issued: 2002-04-01 (en_US)
dc.identifier.issn: 1083-4419 (en_US)
dc.identifier.uri: http://dx.doi.org/10.1109/3477.990877 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/28899
dc.description.abstract: This paper presents a novel feature selection approach for backprop neural networks (NNs). Previously, a feature selection technique known as the wrapper model was shown to be effective for decision tree induction. However, it is prohibitively expensive when applied to real-world neural net training, which is characterized by large volumes of data and many feature choices. Our approach incorporates a weight-analysis-based heuristic called artificial neural net input gain measurement approximation (ANNIGMA) to direct the search in the wrapper model, making effective feature selection feasible for neural net applications. Experimental results on standard datasets show that this approach can efficiently reduce the number of features while maintaining or even improving accuracy. We also report two successful applications of our approach in the helicopter maintenance domain. (en_US)
dc.language.iso: en_US (en_US)
dc.subject: curse of dimensionality (en_US)
dc.subject: feature selection (en_US)
dc.subject: neural networks (NNs) (en_US)
dc.subject: wrapper model (en_US)
dc.title: The ANNIGMA-wrapper approach to fast feature selection for neural nets (en_US)
dc.type: Article (en_US)
dc.identifier.doi: 10.1109/3477.990877 (en_US)
dc.identifier.journal: IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS (en_US)
dc.citation.volume: 32 (en_US)
dc.citation.issue: 2 (en_US)
dc.citation.spage: 207 (en_US)
dc.citation.epage: 212 (en_US)
dc.contributor.department: 資訊工程學系 (zh_TW)
dc.contributor.department: Department of Computer Science (en_US)
dc.identifier.wosnumber: WOS:000174455700008
dc.citation.woscount: 33
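
The abstract above describes ANNIGMA as a weight-analysis heuristic that approximates how strongly each input feature influences the network's outputs, and uses those scores to guide the wrapper search. As a rough illustration only, the Python sketch below computes a normalized input-gain score for a single-hidden-layer backprop network by summing |w_ij * w_jk| over the hidden units. The weight-matrix layout, the normalization, and the ranking helper are assumptions made for this sketch, not the paper's reference implementation.

```python
# Sketch of an ANNIGMA-style input-gain score (illustrative, not the
# authors' reference code). Assumes a single-hidden-layer network with
#   W1: (n_inputs, n_hidden)   input-to-hidden weights
#   W2: (n_hidden, n_outputs)  hidden-to-output weights
import numpy as np


def annigma_scores(W1: np.ndarray, W2: np.ndarray) -> np.ndarray:
    """Approximate the gain of each input on each output as the sum over
    hidden units of |w_ij * w_jk|, normalized by the largest gain per
    output so scores fall in [0, 1]."""
    local_gain = np.abs(W1) @ np.abs(W2)        # shape (n_inputs, n_outputs)
    return local_gain / local_gain.max(axis=0)  # column-wise normalization


def rank_features(W1: np.ndarray, W2: np.ndarray) -> np.ndarray:
    """Rank input features by their largest normalized gain across outputs;
    low-ranked features are candidates for removal in a wrapper-style
    backward search."""
    scores = annigma_scores(W1, W2).max(axis=1)  # one score per input
    return np.argsort(scores)[::-1]              # best feature first


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(8, 5))  # toy example: 8 inputs, 5 hidden units
    W2 = rng.normal(size=(5, 2))  # 5 hidden units, 2 outputs
    print("feature ranking (best first):", rank_features(W1, W2))
```

In a wrapper-style backward search, the lowest-ranked features would be dropped first and the network retrained and re-evaluated after each elimination step; the subset-evaluation loop itself is omitted from this sketch.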
Appears in Collections: Articles


Files in This Item:

  1. 000174455700008.pdf
