Title: Research on Invariant-based 3D Inference under Different Strata of Geometry
Authors: Jain-Shing Liu; Jen-Hui Chuang
Institute of Computer Science and Engineering
Keywords: 3D inference; geometrical strata; shadow generation; cross ratio; error analysis; camera self-calibration; face recognition
Issue Date: 2000
Abstract: In this dissertation, we present several approaches to three-dimensional (3D) inference from two-dimensional (2D) images. This problem is difficult to solve in general; however, it becomes tractable in certain special cases. We first investigate how a shadow can be generated from only two images, provided that some reference points can be identified in them. The proposed shadow-generation algorithm uses only cross ratios and requires no camera calibration. Following that, a geometry-based error analysis for cross ratios is introduced. For more general applications, it may be necessary to establish 3D models first and then project them into an image. Accordingly, the third topic is devoted to a self-calibration approach that uses the absolute dual quadric to recover the rigid motion of a camera for 3D reconstruction. Given only two images, however, this self-calibration approach is ill-conditioned. To address this, our contribution is a linear solution for a camera undergoing only small or certain restricted rotations. Finally, by using a special geometric entity, namely the relative affine structure, we show that a face identification system can be established without a Euclidean reconstruction.
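As a brief aid to readers, the following minimal sketch (not part of the original record; standard textbook definitions, with one common ordering convention assumed for the cross ratio) recalls the two invariants named in the abstract:

% Cross ratio of four collinear points A, B, C, D (signed distances along the line);
% it is preserved by any projective transformation, which is why it can be measured
% directly in uncalibrated images without camera calibration:
\[
  \operatorname{Cr}(A,B,C,D) \;=\; \frac{\overline{AC}\cdot\overline{BD}}{\overline{AD}\cdot\overline{BC}}
\]
% Absolute dual quadric constraint used in self-calibration: for each camera with
% projection matrix P_i and intrinsic matrix K_i (equality up to scale),
\[
  \omega_i^{*} \;\simeq\; P_i\, Q_\infty^{*}\, P_i^{\top}, \qquad \omega_i^{*} = K_i K_i^{\top}
\]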
URI: http://140.113.39.130/cdrfb3/record/nctu/#NT890394104
http://hdl.handle.net/11536/67011
Appears in Collections: Thesis