Full metadata record
DC Field: Value (Language)
dc.contributor.author: Chien, Jen-Tzung (en_US)
dc.contributor.author: Huang, Pei-Wen (en_US)
dc.date.accessioned: 2017-04-21T06:49:22Z
dc.date.available: 2017-04-21T06:49:22Z
dc.date.issued: 2016 (en_US)
dc.identifier.isbn: 978-1-5090-0746-2 (en_US)
dc.identifier.issn: 2161-0363 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/134554
dc.description.abstract: A deep neural network (DNN) is trained by mini-batch optimization based on the stochastic gradient descent algorithm. Such stochastic learning suffers from instability in parameter updating and may easily become trapped in a local optimum. This study addresses the stability of stochastic learning by reducing the variance of the gradients during optimization. We upgrade the optimization from stochastic dual coordinate ascent (SDCA) to accelerated SDCA without duality (dual-free ASDCA). This optimization incorporates the momentum method to accelerate the update rule so that the variance of the gradients is reduced. With dual-free ASDCA, the optimization of the dual function of SDCA, which requires a convex loss, is replaced by directly optimizing the primal function with respect to pseudo-dual parameters. The non-convex optimization in DNN training can therefore be handled and accelerated. Experimental results illustrate the reduction of training loss, gradient variance, and word error rate obtained by using the proposed optimization for DNN speech recognition. (en_US) [A hedged sketch of the dual-free update rule follows this record.]
dc.language.iso: en_US (en_US)
dc.subject: Optimization algorithm (en_US)
dc.subject: variance reduction (en_US)
dc.subject: deep neural network (en_US)
dc.subject: speech recognition (en_US)
dc.title: VARIANCE REDUCTION FOR OPTIMIZATION IN SPEECH RECOGNITION (en_US)
dc.type: Proceedings Paper (en_US)
dc.identifier.journal: 2016 IEEE 26TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP) (en_US)
dc.contributor.department: 電機工程學系 (zh_TW)
dc.contributor.department: Department of Electrical and Computer Engineering (en_US)
dc.identifier.wosnumber: WOS:000392177200056 (en_US)
dc.citation.woscount: 0 (en_US)
Appears in Collections: Conferences Paper
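
Note on the abstract's method: the sketch below is a minimal NumPy illustration of the dual-free SDCA update rule that the paper builds on (after Shalev-Shwartz's "SDCA without Duality"), not the authors' ASDCA implementation. The momentum acceleration described in the abstract is omitted, and the toy least-squares problem, step size, and function names are assumptions made for illustration only.

```python
import numpy as np

def grad_i(w, x_i, y_i):
    """Gradient of the per-sample squared loss 0.5 * (x_i @ w - y_i)**2."""
    return (x_i @ w - y_i) * x_i

def dual_free_sdca(X, y, lam=0.01, eta=0.01, epochs=50, seed=0):
    """Dual-free SDCA sketch for min_w (lam/2)*||w||^2 + (1/n) * sum_i f_i(w).

    A pseudo-dual vector alpha_i is kept per sample, with the primal
    weights tied to them by w = sum_i(alpha_i) / (lam * n). The update
    direction grad_i(w) + alpha_i has shrinking variance as training
    converges, which is the variance-reduction effect the abstract cites.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros((n, d))            # one pseudo-dual vector per sample
    w = alpha.sum(axis=0) / (lam * n)   # primal weights (zero at start)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Variance-reduced direction: per-sample gradient plus pseudo-dual.
            v = grad_i(w, X[i], y[i]) + alpha[i]
            alpha[i] -= eta * lam * n * v
            w -= eta * v                # keeps w = sum(alpha)/(lam*n) in sync
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 5))
    w_true = rng.normal(size=5)
    y = X @ w_true
    # Converges to the l2-regularized solution, close to w_true for small lam.
    print("learned:", dual_free_sdca(X, y))
    print("true:   ", w_true)
```

In the full ASDCA of the paper, a momentum term is additionally folded into this update to accelerate convergence while keeping the gradient variance reduced; the sketch leaves it out for brevity.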