Full metadata record
DC Field | Value | Language
dc.contributor.author | Wang, Guan-Xiang | en_US
dc.contributor.author | Hsu, Chung-Chien | en_US
dc.contributor.author | Chien, Jen-Tzung | en_US
dc.date.accessioned | 2017-04-21T06:48:59Z | -
dc.date.available | 2017-04-21T06:48:59Z | -
dc.date.issued | 2016 | en_US
dc.identifier.isbn | 978-1-4799-9988-0 | en_US
dc.identifier.issn | 1520-6149 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/136363 | -
dc.description.abstract | Deep neural networks have become a new trend for solving various problems in speech processing. In this paper, we propose a discriminative deep recurrent neural network (DRNN) model for monaural speech separation. Our idea is to construct the DRNN as a regression model that discovers the deep structure and regularity for reconstructing signals from a mixture of two source spectra. To reinforce the discrimination capability between the two separated spectra, we estimate the DRNN separation parameters by minimizing an integrated objective function consisting of two measurements. One is the within-source reconstruction error due to the individual source spectra, while the other conveys discrimination information that preserves the mutual difference between the two source spectra during supervised training. This discrimination information acts as a kind of regularization that maintains between-source separation in monaural source separation. Experiments demonstrate the effectiveness of the proposed method for speech separation compared with other methods. | en_US
dc.language.iso | en_US | en_US
dc.subject | deep learning | en_US
dc.subject | discriminative learning | en_US
dc.subject | neural network | en_US
dc.subject | monaural speech separation | en_US
dc.title | DISCRIMINATIVE DEEP RECURRENT NEURAL NETWORKS FOR MONAURAL SPEECH SEPARATION | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2016 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING PROCEEDINGS | en_US
dc.citation.spage | 2544 | en_US
dc.citation.epage | 2548 | en_US
dc.contributor.department | 電機工程學系 (Department of Electrical Engineering) | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000388373402137 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conferences Paper
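The abstract describes an integrated objective with two terms: a within-source reconstruction error and a discrimination term that preserves the mutual difference between the two source spectra. The shape of such an objective can be sketched as below; the function name, the squared-error form, and the between-source weight `gamma` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def discriminative_loss(pred1, pred2, src1, src2, gamma=0.05):
    """Sketch of a discriminative separation objective.

    Minimizes the within-source reconstruction error while
    subtracting a weighted between-source term, so that each
    predicted spectrum is pushed toward its own source and
    away from the competing source.
    """
    # Within-source reconstruction error for each separated spectrum.
    within = np.sum((pred1 - src1) ** 2) + np.sum((pred2 - src2) ** 2)
    # Between-source term: similarity to the *other* source is penalized
    # by subtraction, acting as a discriminative regularizer.
    between = np.sum((pred1 - src2) ** 2) + np.sum((pred2 - src1) ** 2)
    return within - gamma * between
```

With this sign convention, a perfect separation of two distinct sources yields a negative loss (zero reconstruction error, positive between-source distance), while swapping the two outputs is heavily penalized.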