Full metadata record
DC Field: Value (Language)
dc.contributor.author: 嚴柏翔 (zh_TW)
dc.contributor.author: 鄭泗東 (zh_TW)
dc.contributor.author: Yen, Bo-Shiang (en_US)
dc.contributor.author: Cheng, Stone (en_US)
dc.date.accessioned: 2018-01-24T07:38:03Z
dc.date.available: 2018-01-24T07:38:03Z
dc.date.issued: 2016 (en_US)
dc.identifier.uri: http://etd.lib.nctu.edu.tw/cdrfb3/record/nctu/#GT070351904 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/139486
dc.description.abstract: This study designs body movements for a humanoid robot to express musical emotions, using Laban Movement Analysis as the basis for analyzing emotional effort. Movements for different emotions are designed on the basis of the Laban Effort Graph, so that the robot can display appropriate emotional movement responses to the music currently being played. The method uses Laban Movement Analysis (LMA) to establish the relationship between emotions and movements, and integrates the real-time music emotion tracking system developed in our laboratory to recognize emotions and present emotional responses while music is playing. The system extracts five features from the music signal: volume, timbre, tonality, music event density, and interval dissonance. Drawing on the music psychology literature and Thayer's emotion model, the emotion recognition system provides a graphical interface for audio emotion recognition. The music emotion recognition and tracking system generates emotion-strength commands that control a KONDO KHR-3HV humanoid robot to perform the emotional movements in practice. The developed system takes Laban Movement Analysis as its basic motion analysis framework: after the music signal is recognized, movements are designed according to the emotional effort components of the Laban Effort Graph, and the constructed Labanotation score serves as the graphical description of the robot's movements, written so that movement commands can be conveyed precisely. A questionnaire survey was also conducted, and the data were analyzed to examine the accuracy of the emotional movements designed on the basis of the Laban Effort Graph. These music-driven emotional movements can be extended to 3D computer animation, home-care robots, character motion design in computer games, human-computer interaction software, and emotion-aware applications. (zh_TW)
dc.description.abstract: This study develops a method for adding target emotions, inspired by music signals, to the body movements of a humanoid robot. A motion rendering system is proposed that modifies arbitrary basic movements of a real humanoid robot to add the target emotion at the intended strength. Emotional movement features based on Laban Movement Analysis (LMA) are adopted: the music emotions are identified, and emotional movements are then designed according to the Laban Effort Graph. Thayer's 2D model of mood is adopted as the emotion plane for classification and detection; the model consists of four quadrants: (i) Contentment, (ii) Depression, (iii) Anxious, (iv) Exuberance. A sequential framework is built that progressively extracts music features and characterizes music-induced emotions on the emotion plane to trace the real-time emotion locus of the music. Five feature sets are extracted from the music signals, and feature-weighted scoring algorithms continuously mark the trajectory on the emotion plane; a graphical interface displays the tracking of the dynamic emotional locus. The music emotion tracking process generates the intended strength of the music emotion and controls a 17-degree-of-freedom humanoid robot to perform the movements. An experiment with the humanoid robot tests how well the proposed system adds a target emotion to arbitrary movements at the intended strength, and a questionnaire survey assesses the design accuracy to provide an objective verification of the results. The experimental results suggest that the proposed method succeeds in adding a target emotion to arbitrary movements. (en_US)
dc.language.iso: zh_TW (en_US)
dc.subject: Laban Movement Analysis (zh_TW)
dc.subject: Labanotation (zh_TW)
dc.subject: Music Mood Recognition (zh_TW)
dc.subject: Emotional Movement (zh_TW)
dc.subject: Laban Movement Analysis (en_US)
dc.subject: Labanotation (en_US)
dc.subject: Music Mood Recognition (en_US)
dc.subject: Emotional Movement (en_US)
dc.title: Humanoid Robot Music-Emotion Motion Rendering and Labanotation Design Based on Laban Movement Analysis (zh_TW)
dc.title: Motion Rendering of Humanoid Robots Inspired by the Music Emotions Based on Laban Movement Analysis and Labanotation Design (en_US)
dc.type: Thesis (en_US)
dc.contributor.department: Master's Degree Program of Sound and Music Innovative Technologies, College of Engineering (zh_TW)
Appears in Collections: Thesis
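The abstracts above describe a feature-weighted scoring step that projects five extracted music features onto Thayer's valence-arousal plane and classifies the result into one of the four mood quadrants. The following is a minimal sketch of that idea only, assuming features normalized to [-1, 1]; the feature names, weights, and thresholds are illustrative assumptions and are not taken from the thesis.

```python
# Hypothetical sketch: map five music features onto Thayer's 2-D emotion
# plane and classify the point into one of the four quadrants named in the
# abstract. Weights and normalization are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class MusicFeatures:
    volume: float          # loudness, normalized to [-1, 1] (0 = neutral)
    timbre: float          # brightness/sharpness, normalized to [-1, 1]
    tonality: float        # +1 = clearly major, -1 = clearly minor
    event_density: float   # note-onset density, normalized to [-1, 1]
    dissonance: float      # interval dissonance, normalized to [-1, 1]


# Assumed linear weights for projecting features onto the emotion plane.
VALENCE_WEIGHTS = {"tonality": 0.6, "dissonance": -0.3, "timbre": 0.1}
AROUSAL_WEIGHTS = {"volume": 0.5, "event_density": 0.4, "dissonance": 0.1}


def emotion_point(f: MusicFeatures) -> tuple[float, float]:
    """Project a feature vector onto the valence-arousal plane."""
    valence = (VALENCE_WEIGHTS["tonality"] * f.tonality
               + VALENCE_WEIGHTS["dissonance"] * f.dissonance
               + VALENCE_WEIGHTS["timbre"] * f.timbre)
    arousal = (AROUSAL_WEIGHTS["volume"] * f.volume
               + AROUSAL_WEIGHTS["event_density"] * f.event_density
               + AROUSAL_WEIGHTS["dissonance"] * f.dissonance)
    return valence, arousal


def classify_quadrant(valence: float, arousal: float) -> str:
    """Map a point on the plane to one of Thayer's four quadrants."""
    if valence >= 0:
        return "Exuberance" if arousal >= 0 else "Contentment"
    return "Anxious" if arousal >= 0 else "Depression"


if __name__ == "__main__":
    # A bright, loud, major-key frame should land in the Exuberance quadrant.
    frame = MusicFeatures(volume=0.6, timbre=0.4, tonality=1.0,
                          event_density=0.2, dissonance=-0.8)
    v, a = emotion_point(frame)
    print(classify_quadrant(v, a), round(v, 2), round(a, 2))
```

In the system described by the abstracts, the quadrant and the distance from the plane's origin (the emotion strength) would then select and scale the Laban-Effort-based movement sent to the robot; that mapping is not reproduced here.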