Full metadata record
DC Field: Value (Language)
dc.contributor.author: Teo, Tee-Ann (en_US)
dc.contributor.author: Kao, Chung-Hsuan (en_US)
dc.date.accessioned: 2015-12-02T03:00:55Z
dc.date.available: 2015-12-02T03:00:55Z
dc.date.issued: 2012-01-01 (en_US)
dc.identifier.issn: 2194-9034 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/128547
dc.description.abstract: This research integrates existing LOD 2 building models and multiple close-range images for facade structural line extraction. The major tasks are orientation determination and multiple-image matching. In the orientation determination, Speeded Up Robust Features (SURF) is applied to extract tie points automatically; the tie points and control points are then combined for block adjustment. An object-based multi-image matching method is proposed to extract the facade structural lines. The 2D lines in image space are extracted by the Canny operator followed by the Hough transform. The role of the LOD 2 building models is to correct the tilt displacement of images taken from different views. The wall of the LOD 2 model is also used to generate hypothesis planes for similarity measurement. Finally, the average normalized cross correlation is calculated to obtain the best location in object space. The test images were acquired by a non-metric Nikon D2X camera; the total number of images is 33. The experimental results indicate that the accuracy of the orientation determination is about 1 pixel, based on 2515 tie points and 4 control points. They also indicate that line-based matching is more flexible than point-based matching. (en_US)
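The similarity measure named in the abstract, average normalized cross correlation, can be illustrated with a minimal sketch: the NCC between image windows back-projected from a hypothesized plane position is averaged over all pairs of views, and the position with the highest average score wins. The window size, the synthetic data, and the function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two equal-size image windows.
    Returns a value in [-1, 1]; invariant to brightness and contrast changes."""
    a = a.astype(float).ravel() - a.mean()
    b = b.astype(float).ravel() - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0

def average_ncc(windows):
    """Average NCC over all pairs of windows (one window per view).
    A hypothesized object-space location scores high when the windows
    it back-projects to agree across all views."""
    n = len(windows)
    scores = [ncc(windows[i], windows[j])
              for i in range(n) for j in range(i + 1, n)]
    return sum(scores) / len(scores)

# Toy example: three windows showing the same content under slight noise
# and a brightness/contrast change score close to 1.
rng = np.random.default_rng(0)
base = rng.random((11, 11))
views = [base, base + 0.01 * rng.random((11, 11)), 1.2 * base + 0.3]
print(average_ncc(views))  # close to 1
```

In the matching described by the abstract, this score would be evaluated for candidate positions along the hypothesis plane generated from the LOD 2 wall, and the best-scoring position taken as the line's location in object space.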
dc.language.iso: en_US (en_US)
dc.subject: building (en_US)
dc.subject: facade (en_US)
dc.subject: linear feature (en_US)
dc.subject: multiple images matching (en_US)
dc.title: LINE-BASED MULTI-IMAGE MATCHING FOR FACADE RECONSTRUCTION (en_US)
dc.type: Proceedings Paper (en_US)
dc.identifier.journal: XXII ISPRS CONGRESS, TECHNICAL COMMISSION III (en_US)
dc.citation.volume: 39-B3 (en_US)
dc.citation.spage: 63 (en_US)
dc.citation.epage: 68 (en_US)
dc.contributor.department: 土木工程學系 (zh_TW)
dc.contributor.department: Department of Civil Engineering (en_US)
dc.identifier.wosnumber: WOS:000358211200012 (en_US)
dc.citation.woscount: 0 (en_US)
Appears in Collections: Conferences Paper