Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Huang, Chih-Fang | en_US |
dc.contributor.author | Nien, Wei-Po | en_US |
dc.date.accessioned | 2014-12-08T15:33:18Z | - |
dc.date.available | 2014-12-08T15:33:18Z | - |
dc.date.issued | 2013 | en_US |
dc.identifier.issn | 1550-1329 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/23178 | - |
dc.identifier.uri | http://dx.doi.org/10.1155/2013/645961 | en_US |
dc.description.abstract | Automated and algorithmic music composition are based on logical operations, with music parameters set according to the desired music style or emotion. Computer-generated music can be integrated with other domains through suitable mapping techniques, for example intermedia arts combined with music technology. This paper discusses the integration of automatic composition and motion devices with an Emotional Identification System (EIS), using emotion classification and parameters transmitted over ZigBee wireless communication. The corresponding music pattern and motion path are driven simultaneously by the time-varying cursor movement on the Emotion Map (EM). An interactive music-motion platform is established accordingly. | en_US |
dc.language.iso | en_US | en_US |
dc.title | A Study of the Integrated Automated Emotion Music with the Motion Gesture Synthesis via ZigBee Wireless Communication | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.1155/2013/645961 | en_US |
dc.identifier.journal | INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS | en_US |
dc.contributor.department | 機械工程學系 | zh_TW |
dc.contributor.department | Department of Mechanical Engineering | en_US |
dc.identifier.wosnumber | WOS:000327344000001 | - |
dc.citation.woscount | 1 | - |
Appears in Collections: Articles
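The abstract describes driving both a music pattern and a motion path from the time-varying cursor position on the Emotion Map, with commands carried over ZigBee. The sketch below is a minimal, hypothetical illustration of such a pipeline in Python; the quadrant-based classification, the parameter ranges, the function names, and the serial frame layout are assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch: map an Emotion Map (EM) cursor position to coarse
# music parameters and a motion-gesture command, then pack the command into
# a small frame that a host could write to a serial-attached ZigBee module.
# All names, ranges, and the frame layout are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class EmotionPoint:
    valence: float  # -1.0 (negative) .. +1.0 (positive)
    arousal: float  # -1.0 (calm)     .. +1.0 (excited)


def classify_emotion(p: EmotionPoint) -> str:
    """Quadrant-based emotion classification on a valence-arousal plane."""
    if p.valence >= 0:
        return "happy" if p.arousal >= 0 else "relaxed"
    return "angry" if p.arousal >= 0 else "sad"


def map_emotion_to_music(p: EmotionPoint) -> dict:
    """Map the EM cursor to coarse musical parameters (tempo, mode, dynamics)."""
    return {
        "tempo_bpm": int(70 + 60 * (p.arousal + 1) / 2),  # 70-130 BPM
        "mode": "major" if p.valence >= 0 else "minor",
        "velocity": int(60 + 50 * (p.arousal + 1) / 2),   # MIDI velocity 60-110
    }


def map_emotion_to_motion(p: EmotionPoint) -> dict:
    """Map the same cursor position to a simple motion-gesture command."""
    return {
        "speed": round(abs(p.arousal), 2),  # 0 (still) .. 1 (fast)
        "direction": "forward" if p.valence >= 0 else "backward",
    }


def pack_motion_frame(motion: dict) -> bytes:
    """Pack the motion command into a tiny frame (header, speed, direction,
    checksum) suitable for writing to a ZigBee radio over a UART link."""
    speed_byte = int(motion["speed"] * 255)
    dir_byte = 0x01 if motion["direction"] == "forward" else 0x02
    payload = bytes([0x7E, speed_byte, dir_byte])
    checksum = sum(payload) & 0xFF
    return payload + bytes([checksum])


if __name__ == "__main__":
    cursor = EmotionPoint(valence=0.6, arousal=0.8)  # EM cursor at time t
    print(classify_emotion(cursor))           # -> "happy"
    print(map_emotion_to_music(cursor))       # tempo/mode/velocity for the composer
    print(pack_motion_frame(map_emotion_to_motion(cursor)).hex())
    # In a real deployment the frame would be written to the ZigBee module,
    # e.g. via pyserial: serial.Serial("/dev/ttyUSB0", 9600).write(frame)
```

In this reading of the abstract, the same EM cursor sample feeds two mappings at once, so music and motion stay synchronized simply by being derived from the same time-stamped point; the actual classification scheme and frame format used by the authors may differ.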