Full metadata record

DC Field | Value | Language
dc.contributor.author | Huang, Chih-Fang | en_US
dc.contributor.author | Nien, Wei-Po | en_US
dc.date.accessioned | 2014-12-08T15:33:18Z | -
dc.date.available | 2014-12-08T15:33:18Z | -
dc.date.issued | 2013 | en_US
dc.identifier.issn | 1550-1329 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/23178 | -
dc.identifier.uri | http://dx.doi.org/10.1155/2013/645961 | en_US
dc.description.abstract | Automated music composition and algorithmic composition are based on logic operations, with music parameters set according to the desired music style or emotion. Computer-generated music can be integrated with other domains, such as intermedia arts with music technology, using proper mapping techniques. This paper mainly discusses the possibility of integrating both automatic composition and motion devices with an Emotional Identification System (EIS), using emotion classification and parameters transmitted over ZigBee wireless communication. The corresponding music pattern and motion path can be driven simultaneously via the cursor movement of the Emotion Map (EM) as it varies with time. An interactive music-motion platform is established accordingly. | en_US
dc.language.iso | en_US | en_US
dc.title | A Study of the Integrated Automated Emotion Music with the Motion Gesture Synthesis via ZigBee Wireless Communication | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1155/2013/645961 | en_US
dc.identifier.journal | INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS | en_US
dc.contributor.department | 機械工程學系 | zh_TW
dc.contributor.department | Department of Mechanical Engineering | en_US
dc.identifier.wosnumber | WOS:000327344000001 | -
dc.citation.woscount | 1 | -
Appears in Collections: Journal Articles


Files in This Item:

  1. 000327344000001.pdf

If the file is a zip archive, download and extract it, then open the index.html in the extracted folder with a browser to view the full text.