Full metadata record
DC Field                   Value                                       Language
dc.contributor.author      Huang, Chih-Fang                            en_US
dc.contributor.author      Nien, Wei-Po                                en_US
dc.date.accessioned        2014-12-08T15:33:18Z                        -
dc.date.available          2014-12-08T15:33:18Z                        -
dc.date.issued             2013                                        en_US
dc.identifier.issn         1550-1329                                   en_US
dc.identifier.uri          http://hdl.handle.net/11536/23178           -
dc.identifier.uri          http://dx.doi.org/10.1155/2013/645961       en_US
dc.description.abstract    Automated and algorithmic music composition rely on logical operations whose music parameters are set according to the desired musical style or emotion. Computer-generated music can be integrated with other domains, such as intermedia arts, through suitable mapping techniques. This paper discusses integrating automatic composition and motion devices with an Emotional Identification System (EIS), using emotion classification and parameters transmitted over ZigBee wireless communication. The corresponding music pattern and motion path are driven simultaneously by the time-varying cursor position on the Emotion Map (EM), establishing an interactive music-motion platform.  en_US
dc.language.iso            en_US                                       en_US
dc.title                   A Study of the Integrated Automated Emotion Music with the Motion Gesture Synthesis via ZigBee Wireless Communication  en_US
dc.type                    Article                                     en_US
dc.identifier.doi          10.1155/2013/645961                         en_US
dc.identifier.journal      INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS  en_US
dc.contributor.department  機械工程學系                                zh_TW
dc.contributor.department  Department of Mechanical Engineering        en_US
dc.identifier.wosnumber    WOS:000327344000001                         -
dc.citation.woscount       1                                           -
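
The abstract describes a pipeline in which a time-varying cursor position on an Emotion Map is mapped to music parameters and a motion path, then delivered to devices over ZigBee. The sketch below is purely illustrative and is not the authors' implementation: the valence/arousal axes, the parameter ranges, the mapping rules, and the packet layout are all assumptions made for exposition.

# Minimal sketch (illustrative only): map a 2-D Emotion Map cursor position
# to music parameters and a motion command, then frame the result as a small
# byte packet that a serial-attached ZigBee radio could forward.
# Axes, ranges, mapping rules, and packet layout are assumptions,
# not taken from the paper.

import struct

def emotion_to_music(valence: float, arousal: float) -> dict:
    """Map a cursor in [-1, 1] x [-1, 1] to toy music parameters."""
    tempo = int(60 + 80 * (arousal + 1) / 2)     # 60-140 BPM, rising with arousal
    mode = "major" if valence >= 0 else "minor"  # rough valence -> mode rule
    velocity = int(50 + 60 * abs(arousal))       # louder when arousal is high
    return {"tempo": tempo, "mode": mode, "velocity": velocity}

def emotion_to_motion(valence: float, arousal: float) -> tuple:
    """Pick a motion-path heading and speed from the same cursor position."""
    heading = 90 if valence >= 0 else 270        # degrees; toy rule
    speed = max(0.1, abs(arousal))               # normalized 0.1-1.0
    return heading, speed

def frame_packet(valence: float, arousal: float) -> bytes:
    """Pack the cursor and derived parameters into one byte frame."""
    music = emotion_to_music(valence, arousal)
    heading, speed = emotion_to_motion(valence, arousal)
    # layout: 2-byte header, valence, arousal, tempo, velocity, heading, speed
    return struct.pack(
        "<2sffHBHf", b"EM", valence, arousal,
        music["tempo"], music["velocity"], heading, speed,
    )

if __name__ == "__main__":
    pkt = frame_packet(valence=0.4, arousal=0.7)
    print(pkt.hex())  # bytes that would be written to the radio's serial port

In a platform like the one the paper describes, such a frame would presumably be written to the serial port of a ZigBee module (for example via a serial library such as pyserial); that transport detail is omitted here, and the receiving music and motion devices would decode the same layout.
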
Appears in Collections: Articles


Files in This Item:

  1. 000327344000001.pdf
