Title: | Physically based cosmetic rendering |
Authors: | Huang, Cheng-Guo; Huang, Tsung-Shian; Lin, Wen-Chieh; Chuang, Jung-Hong; Department of Computer Science |
Keywords: | skin rendering;translucent rendering;cosmetic rendering |
Issue Date: | 1-May-2013 |
Abstract: | Simulating realistic makeup effects is an important research issue in 3D facial animation and the cosmetics industry. Existing approaches based on image processing techniques, such as warping and blending, have mostly been applied to transfer one person's makeup to another's. Although these approaches are intuitive and require only makeup images, they have drawbacks such as distorted shapes and fixed viewing and lighting conditions. In this paper, we propose an integrated approach, which combines the Kubelka-Munk model and a screen-space skin rendering approach, to simulate 3D makeup effects. The Kubelka-Munk model is used to compute the total transmittance when light passes through cosmetic layers, whereas the screen-space translucent rendering approach simulates the subsurface scattering effects inside human skin. The parameters of the Kubelka-Munk model are obtained by measuring the optical properties of different cosmetic materials, such as foundations, blushes, and lipsticks. Our results demonstrate that the proposed approach is able to render realistic cosmetic effects on human facial models, and that different cosmetic materials and styles can be flexibly applied and simulated in real time. Copyright (c) 2013 John Wiley & Sons, Ltd. |
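The abstract does not reproduce the model equations, but the standard single-layer Kubelka-Munk relations and Kubelka's layering equations give the total transmittance of stacked cosmetic layers that the paper refers to. Below is a minimal sketch of that computation; the per-channel absorption/scattering coefficients and layer thicknesses are hypothetical placeholders, not the measured values from the paper, and the function names are illustrative only.

```python
import numpy as np

def km_layer(K, S, d):
    """Standard Kubelka-Munk reflectance/transmittance of one layer.

    K, S : per-RGB-channel absorption and scattering coefficients
    d    : layer thickness
    """
    a = (S + K) / S
    b = np.sqrt(a * a - 1.0)
    denom = a * np.sinh(b * S * d) + b * np.cosh(b * S * d)
    R = np.sinh(b * S * d) / denom   # diffuse reflectance of the layer
    T = b / denom                    # diffuse transmittance of the layer
    return R, T

def km_composite(R1, T1, R2, T2):
    """Stack layer 1 on top of layer 2 (Kubelka's layering equations)."""
    inter = 1.0 - R1 * R2            # accounts for inter-reflections between layers
    R = R1 + T1 * T1 * R2 / inter
    T = T1 * T2 / inter
    return R, T

# Hypothetical coefficients: a foundation layer applied over a blush layer.
K_found, S_found = np.array([0.4, 0.5, 0.7]), np.array([3.0, 2.8, 2.5])
K_blush, S_blush = np.array([0.3, 1.2, 1.4]), np.array([2.0, 1.5, 1.3])

Rf, Tf = km_layer(K_found, S_found, d=0.02)
Rb, Tb = km_layer(K_blush, S_blush, d=0.01)
R_total, T_total = km_composite(Rf, Tf, Rb, Tb)  # incident light meets the foundation first
print("total transmittance reaching the skin:", T_total)
```

In a full pipeline, this total transmittance would modulate the light entering the skin before a screen-space subsurface scattering pass; the compositing step above is what allows different cosmetic materials and layer orders to be swapped at run time.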
URI: | http://dx.doi.org/10.1002/cav.1523 http://hdl.handle.net/11536/21897 |
ISSN: | 1546-4261 |
DOI: | 10.1002/cav.1523 |
Journal: | COMPUTER ANIMATION AND VIRTUAL WORLDS |
Volume: | 24 |
Issue: | 3-4 |
Start Page: | 275 |
End Page: | 283 |
Appears in Collections: | Articles |