Full metadata record
DC Field | Value | Language
dc.contributor.author | Chi, Tai-Shih | en_US
dc.contributor.author | Yeh, Lan-Ying | en_US
dc.contributor.author | Hsu, Chin-Cheng | en_US
dc.date.accessioned | 2015-07-21T08:28:21Z | -
dc.date.available | 2015-07-21T08:28:21Z | -
dc.date.issued | 2012-03-01 | en_US
dc.identifier.issn | 1868-5137 | en_US
dc.identifier.uri | http://dx.doi.org/10.1007/s12652-011-0088-5 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/124609 | -
dc.description.abstract | Most speech emotion recognition studies consider clean speech. In this study, statistics of joint spectro-temporal modulation features are extracted from an auditory perceptual model and are used to detect the emotional status of speech under noisy conditions. Speech samples were taken from the Berlin Emotional Speech database and corrupted with white and babble noise at various SNR levels. This study investigates a clean train/noisy test scenario to simulate practical conditions with unknown noise sources. Simulations demonstrate the redundancy of the proposed spectro-temporal modulation features, and dimensionality reduction is further considered. The proposed modulation features achieve higher recognition rates of speech emotions under noisy conditions than (1) conventional mel-frequency cepstral coefficients combined with prosodic features and (2) the official acoustic features adopted in the INTERSPEECH 2009 Emotion Challenge. Adding the modulation features to the INTERSPEECH feature set increased recognition rates by approximately 7% for all tested SNR conditions (20-0 dB). (An illustrative sketch of the noisy-test condition follows this record.) | en_US
dc.language.iso | en_US | en_US
dc.subject | Robust emotion recognition | en_US
dc.subject | Spectro-temporal modulation | en_US
dc.title | Robust emotion recognition by spectro-temporal modulation statistic features | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1007/s12652-011-0088-5 | en_US
dc.identifier.journal | JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING | en_US
dc.citation.spage | 47 | en_US
dc.citation.epage | 60 | en_US
dc.contributor.department | 電機工程學系 | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000209331500006 | en_US
dc.citation.woscount | 3 | en_US
Appears in Collections: Articles
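
The abstract describes corrupting clean Berlin Emotional Speech utterances with white and babble noise at SNR levels from 20 dB down to 0 dB for the noisy-test condition. The following is a minimal sketch of that mixing step only, not the authors' code or their auditory-model feature extraction; the function name, signal names, and the use of NumPy are assumptions for illustration.

```python
# Illustrative sketch (not the authors' implementation): additive noise mixing
# at a target SNR, as used in a clean train / noisy test setup.
import numpy as np

def mix_at_snr(clean: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Add `noise` to `clean` so the mixture has the requested SNR in dB."""
    # Loop or trim the noise to match the length of the clean signal.
    if len(noise) < len(clean):
        noise = np.tile(noise, int(np.ceil(len(clean) / len(noise))))
    noise = noise[:len(clean)]

    clean_power = np.mean(clean ** 2)
    noise_power = np.mean(noise ** 2)

    # Scale the noise so that 10*log10(clean_power / scaled_noise_power) == snr_db.
    target_noise_power = clean_power / (10.0 ** (snr_db / 10.0))
    noise_scaled = noise * np.sqrt(target_noise_power / noise_power)
    return clean + noise_scaled

if __name__ == "__main__":
    # Synthetic stand-ins for a 1 s utterance and white noise at 16 kHz.
    rng = np.random.default_rng(0)
    clean = rng.standard_normal(16000)
    white_noise = rng.standard_normal(16000)
    # SNR levels in the range reported in the abstract (20-0 dB).
    noisy_versions = {snr: mix_at_snr(clean, white_noise, snr) for snr in (20, 15, 10, 5, 0)}
```

Feature extraction and classification would then be run on such noisy mixtures while training only on the clean recordings, matching the clean train/noisy test scenario stated in the abstract.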