Title: | On Robustness of Spectro-Temporal Modulation Features in an Emotion Recognition Framework |
Authors: | 許晉誠 Hsu, Chin-Cheng; 冀泰石 Chi, Tai-Shih; Institute of Communications Engineering |
Keywords: | Emotion; Recognition; Spectro-Temporal Modulation Features; Robustness |
Date of Issue: | 2011 |
Abstract: | Noise is a troublesome problem in emotion recognition, as in many other applications. Most previous research has adopted matched-condition training to counter it. This thesis, in contrast, considers mismatched conditions, training a single classifier to confront all kinds of noisy situations. Experiments show that the proposed feature set, which contains spectro-temporal modulation information, remains highly robust even under the strictest mismatched condition, indicating that mismatched training/testing is feasible. The thesis also discusses the properties of the proposed features and how noise affects them. The experiments covered four variables: two databases (the Aibo Emotion Corpus and the Berlin Emotional Speech Database), two types of noise (additive white Gaussian noise and babble noise), two feature sets (spectro-temporal modulation features and the INTERSPEECH 2009 Emotion Challenge feature set), and two training/testing conditions (slack and strict mismatched conditions). To address the issue of data imbalance, a sample-synthesis scheme incorporating emotion validity was proposed. |
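The validity-weighted synthesis scheme itself is specific to the thesis and not detailed in this abstract. As a rough illustration of the general idea of synthesizing minority-class samples to counter data imbalance, the sketch below implements a SMOTE-style interpolation between a minority sample and one of its nearest neighbours; the function name, the choice of `k`, and the uniform interpolation factor are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

def synthesize_minority(X, n_new, k=3, seed=None):
    """SMOTE-style oversampling sketch (not the thesis's validity-based
    scheme): create n_new synthetic samples by interpolating between a
    randomly chosen minority-class sample and one of its k nearest
    neighbours under Euclidean distance."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        # distances from X[i] to every minority sample (including itself)
        d = np.linalg.norm(X - X[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]  # skip X[i] itself
        j = rng.choice(neighbours)
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(X[i] + t * (X[j] - X[i]))
    return np.array(synthetic)

# e.g. 10 minority-class feature vectors in a 4-dimensional feature space
X = np.random.default_rng(0).normal(size=(10, 4))
X_new = synthesize_minority(X, n_new=5, seed=0)
print(X_new.shape)  # (5, 4)
```

A validity-aware variant, as the abstract hints, would additionally weight the sampling or interpolation by per-utterance emotion-validity scores rather than treating all minority samples uniformly.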
URI: | http://140.113.39.130/cdrfb3/record/nctu/#GT079813533 http://hdl.handle.net/11536/47019 |
Appears in Collections: | Thesis |