Title: Real-Time Rendering of Digital Facial Makeup
Author: CHUANG JUNG-HONG (莊榮宏)
Department of Computer Science, National Chiao Tung University
Keywords: human skin translucent rendering; cosmetic simulation; cosmetic rendering; cosmetic material measurement
Issue Date: 2012
Abstract: Rendering realistic facial makeup is an important problem in 3D facial animation and in the cosmetics industry. Existing methods use image warping to transfer a makeup effect from one face image to another. These methods suffer from shape distortion, require the input images to have similar viewing angles, face shapes, and lighting conditions, and do not consider how the skin and the cosmetics are rendered together. In this project, we will develop a system that integrates human skin rendering with cosmetic simulation to render 3D makeup effects in real time. The subsurface scattering of human skin will be rendered with a screen-space translucency technique, and light transport through multi-layered cosmetics will be formulated with the Kubelka-Munk model, or with a more elaborate multi-layer rendering model where necessary. In addition, we will measure the material properties of several cosmetics, including foundations, blushes, and lipsticks, and derive from the measured spectral data the rendering parameters of the Kubelka-Munk model or of other multi-layer subsurface scattering models. We will also develop an interface system that lets users design makeup conveniently.
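A minimal per-wavelength sketch of the standard Kubelka-Munk relations the abstract refers to: inverting the measured reflectance of an opaque cosmetic swatch to obtain K/S, evaluating the reflectance and transmittance of a finite cosmetic layer, and compositing that layer over the skin. The function names, the optical thickness value, and the sample reflectance data are illustrative assumptions, not measurements or code from the project.

```python
import numpy as np

def ks_from_reflectance(r_inf):
    # Kubelka-Munk inversion for an opaque sample: K/S = (1 - R_inf)^2 / (2 R_inf)
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

def layer_r_t(k_over_s, s_times_d):
    # Hyperbolic Kubelka-Munk solution for a layer of optical thickness S*d:
    # returns its reflectance R and transmittance T per wavelength.
    a = 1.0 + k_over_s
    b = np.sqrt(a * a - 1.0)
    denom = a * np.sinh(b * s_times_d) + b * np.cosh(b * s_times_d)
    return np.sinh(b * s_times_d) / denom, b / denom

def composite_over_skin(r_layer, t_layer, r_skin):
    # Stack the cosmetic layer on the skin, including inter-reflection
    # between layer and substrate: R = R1 + T1^2 * Rg / (1 - R1 * Rg)
    return r_layer + t_layer ** 2 * r_skin / (1.0 - r_layer * r_skin)

# Hypothetical spectral samples (three wavelengths) of an opaque
# foundation swatch and of bare skin; real data would come from the
# spectral measurements described in the abstract.
r_foundation_opaque = np.array([0.62, 0.55, 0.48])
r_skin = np.array([0.45, 0.35, 0.30])

ks = ks_from_reflectance(r_foundation_opaque)
r_layer, t_layer = layer_r_t(ks, s_times_d=0.5)
print(composite_over_skin(r_layer, t_layer, r_skin))
```

In a real-time renderer these per-wavelength evaluations would presumably be precomputed into lookup textures or evaluated per RGB channel in a shader before being fed to the screen-space skin rendering pass.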
Official Document #: NSC101-2221-E009-156-MY2
URI: http://hdl.handle.net/11536/98502
https://www.grb.gov.tw/search/planDetail?id=2631612&docId=395254
Appears in Collections: Research Plans