Full metadata record
DC Field | Value | Language
dc.contributor.author | Juang, Gwo-Hao | en_US
dc.contributor.author | Chen, Yan-Ju | en_US
dc.contributor.author | Peng, Jen-Yu | en_US
dc.contributor.author | Lin, I-Chen | en_US
dc.contributor.author | Chao, Jui-Hsiang | en_US
dc.date.accessioned | 2014-12-08T15:02:55Z | -
dc.date.available | 2014-12-08T15:02:55Z | -
dc.date.issued | 2008 | en_US
dc.identifier.isbn | 978-80-86943-16-9 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/1530 | -
dc.description.abstract | Interactively controlling or editing motion capture data is an intriguing topic in games and animation prototyping. However, data-driven approaches require a large data set to synthesize motion of high variety. In the first part of this paper, we propose a novel partial motion synthesis method to extend the control parameter space and variety of limited motion data. We extend the attack range of kicking or punching motions by blending body parts separately and then reassembling them. Users can simply assign a target position in the extended parameter space and obtain a motion that hits the desired target. In the second application, we apply partial motion assembly to motion editing. Given a sequence of key postures, our system retrieves resembling partial figures from the data sets. By reassembling the different parts of the character motion and adjusting the motion variance according to the query motion, the synthesized motions preserve the original style and naturalness. | en_US
dc.language.iso | en_US | en_US
dc.subject | motion synthesis | en_US
dc.subject | partial motion blending | en_US
dc.subject | partial motion assembly | en_US
dc.title | Partial Motion Blending and Assembly for Interactive Motion Synthesis | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | WSCG 2008, COMMUNICATION PAPERS | en_US
dc.citation.spage | 129 | en_US
dc.citation.epage | 136 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000264421700018 | -
Appears in Collections: Conference Papers
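The abstract above describes blending body parts from different motions separately and then reassembling them into a full-body result. Below is a minimal Python sketch of that per-part blending idea only; the joint grouping, the simplified one-angle-per-joint pose representation, and the function and variable names are illustrative assumptions, not the paper's actual implementation (which operates on motion capture clips and an extended attack-range parameter space).

```python
# Hypothetical sketch: blend two poses per body part with separate weights,
# then reassemble the parts into one full-body pose.

from typing import Dict, List

# Assumed grouping of skeleton joints into body parts (not from the paper).
BODY_PARTS: Dict[str, List[str]] = {
    "left_leg":  ["l_hip", "l_knee", "l_ankle"],
    "right_leg": ["r_hip", "r_knee", "r_ankle"],
    "torso":     ["pelvis", "spine", "neck", "head"],
    "left_arm":  ["l_shoulder", "l_elbow", "l_wrist"],
    "right_arm": ["r_shoulder", "r_elbow", "r_wrist"],
}

# Simplified pose: joint name -> single rotation angle (radians).
Pose = Dict[str, float]


def blend_partial(pose_a: Pose, pose_b: Pose, part_weights: Dict[str, float]) -> Pose:
    """Blend two poses per body part and reassemble the result.

    part_weights maps a body-part name to a weight w in [0, 1];
    w = 0 keeps pose_a, w = 1 takes pose_b for every joint in that part.
    """
    result: Pose = {}
    for part, joints in BODY_PARTS.items():
        w = part_weights.get(part, 0.0)
        for joint in joints:
            # Linear interpolation; a real system would slerp joint quaternions.
            result[joint] = (1.0 - w) * pose_a[joint] + w * pose_b[joint]
    return result


if __name__ == "__main__":
    # Two toy kicking poses that differ only in the right leg and torso lean.
    pose_near = {j: 0.0 for joints in BODY_PARTS.values() for j in joints}
    pose_far = dict(pose_near, r_hip=1.2, r_knee=0.8, spine=0.3)

    # Push the kick farther by blending only the kicking leg and torso,
    # while keeping the support leg and arms from the original pose.
    blended = blend_partial(pose_near, pose_far,
                            {"right_leg": 0.7, "torso": 0.5})
    print({k: round(v, 2) for k, v in blended.items() if v != 0.0})
```

In this toy run only the right-leg and torso joints move toward the second pose, while the remaining parts keep their original values, which mirrors the abstract's idea of reassembling independently blended body parts.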