Full metadata record
DC Field | Value | Language
dc.contributor.author | 傅俊傑 | en_US
dc.contributor.author | Fu, Jun-Jie | en_US
dc.contributor.author | 鄭泗東 | en_US
dc.contributor.author | Cheng, Stone | en_US
dc.date.accessioned | 2015-11-26T01:07:28Z | -
dc.date.available | 2015-11-26T01:07:28Z | -
dc.date.issued | 2010 | en_US
dc.identifier.uri | http://140.113.39.130/cdrfb3/record/nctu/#GT079702503 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/44186 | -
dc.description.abstract | Previous research on music emotion classification or recognition has usually assumed that an entire piece of music, or each of its segments, carries a single stable emotion, which is then summarized as the "mood" the music conveys to the listener. The system built in this study is instead modeled on the human listening process: the listener's emotional response at any moment is influenced by the musical features heard over the preceding interval, so it is better treated as a continuously varying process. To relate music to emotion, the system adopts the emotion model proposed by Thayer, which divides emotion into four main classes: (1) contentment, (2) depression, (3) anxiety, and (4) exuberance. In the training mode, features are extracted from a large number of music clips manually labeled with Thayer emotion classes; the main features are volume, musical event density tracking, and key tracking. The emotion the user may currently feel is represented by a red dot (the emotion indicator) on the emotion plane; at each instant the musical features are used to compute the indicator's score and displacement trajectory on Thayer's emotion plane, and a GMM is then applied to the trajectory coordinates and the pre-labeled classes to find the boundary of each class. In the user mode, the emotion trajectory is displayed on the emotion plane together with these boundaries, helping the user understand the relationships among the emotions and enriching the listening experience. | zh_TW
dc.description.abstract | As artificial intelligence and machine learning technologies develop, people are pursuing applications that let them interact with computers in more humanized and personalized ways. In recent years, affective computing for content-based information retrieval has become a very popular research area for both image and audio signals, and using emotion as an index for Music Information Retrieval remains a challenging issue. Music plays an important role in everyday life: whether or not listeners have received professional music education, different kinds of music can arouse and convey different emotional responses. Although emotion is a subjective feeling and the same music may evoke different emotions in different people, there is still a general trend in listeners' emotional responses. Most state-of-the-art research on music emotion classification or prediction assumes that a piece of music carries a single constant emotion, or at least a constant emotion within each segment. The system proposed in this thesis is instead based on the human listening process: the emotional response at any instant is primarily influenced by the features heard in the past few seconds. The response is displayed as a moving dot and its trajectory on Thayer's emotion plane, which enhances the listening experience, and the trajectory also provides clues for estimating the overall emotion, or mood, of the music. | en_US
dc.language.iso | zh_TW | en_US
dc.subject | 音樂資訊檢索 | zh_TW
dc.subject | 音樂情緒辨識 | zh_TW
dc.subject | 音樂情緒追蹤 | zh_TW
dc.subject | 特徵萃取 | zh_TW
dc.subject | 高斯混和模型 | zh_TW
dc.subject | Music Information Retrieval | en_US
dc.subject | Music Mood Recognition | en_US
dc.subject | Music Emotion Tracking | en_US
dc.subject | Feature Extraction | en_US
dc.subject | GMM | en_US
dc.title | 具時變情緒軌跡介面之自動音樂情緒追蹤系統 | zh_TW
dc.title | Emotion Locus Tracking System for Automatic Mood Detection and Classification of Music Signals | en_US
dc.type | Thesis | en_US
dc.contributor.department | Master Program of Sound and Music Innovative Technologies, College of Engineering | zh_TW
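
The abstracts above describe a concrete pipeline: per-frame features (volume, event density, key) drive an emotion indicator whose position on Thayer's valence-arousal plane depends mainly on the features heard over the preceding few seconds, and GMMs fitted to labelled trajectory points supply the class boundaries. The sketch below only illustrates that idea under stated assumptions and is not the thesis implementation: `extract_frame_features`, the arousal/valence weights, and the smoothing factor are hypothetical placeholders.

```python
# Minimal sketch (assumed design, not the thesis code) of an emotion-locus pipeline:
#   per-frame features -> smoothed point on Thayer's plane -> per-class GMM boundaries.
import numpy as np
from sklearn.mixture import GaussianMixture

# The four Thayer classes named in the abstract.
THAYER_CLASSES = ["contentment", "depression", "anxiety", "exuberance"]


def extract_frame_features(frame: np.ndarray) -> np.ndarray:
    """Hypothetical per-frame features standing in for volume, event density and key tracking."""
    loudness = np.sqrt(np.mean(frame ** 2))                          # RMS energy as a volume proxy
    event_density = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0   # zero-crossing rate as a crude onset proxy
    mode_score = 0.0                                                  # placeholder for major/minor (key) tracking
    return np.array([loudness, event_density, mode_score])


def emotion_locus(features: np.ndarray, alpha: float = 0.2) -> np.ndarray:
    """Map a (T, 3) feature sequence to a smoothed trajectory on Thayer's plane.

    Exponential smoothing makes each point depend mainly on the last few
    seconds of features; the linear weights below are assumed, not the thesis's.
    """
    arousal = features @ np.array([4.0, 2.0, 0.0])   # loud, event-dense music -> higher arousal
    valence = features @ np.array([0.0, 0.5, 3.0])   # major mode -> more positive valence
    locus = np.stack([valence, arousal], axis=1)
    for t in range(1, len(locus)):
        locus[t] = alpha * locus[t] + (1 - alpha) * locus[t - 1]
    return locus


def fit_class_models(trajectories, labels, n_components=2):
    """Fit one GMM per Thayer class from manually labelled trajectory points (training mode)."""
    models = {}
    for cls in THAYER_CLASSES:
        points = np.vstack([trj for trj, lab in zip(trajectories, labels) if lab == cls])
        models[cls] = GaussianMixture(n_components=n_components, random_state=0).fit(points)
    return models


def classify_point(models, point):
    """Assign a locus point to the class whose GMM gives it the highest likelihood (user mode)."""
    point = np.asarray(point).reshape(1, -1)
    return max(models, key=lambda cls: models[cls].score(point))
```

In this sketch the exponential smoothing stands in for the abstract's notion that the current response depends on the past few seconds of audio; any short-memory filter, such as a sliding-window average, would serve the same illustrative purpose.
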
Appears in Collections: Thesis


Files in This Item:

  1. 250302.pdf
