Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lin, Ke-Yu | en_US |
dc.contributor.author | Hang, Hsueh-Ming | en_US |
dc.date.accessioned | 2019-08-02T02:24:15Z | - |
dc.date.available | 2019-08-02T02:24:15Z | - |
dc.date.issued | 2018-01-01 | en_US |
dc.identifier.isbn | 978-9-8814-7685-2 | en_US |
dc.identifier.issn | 2309-9402 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/152428 | - |
dc.description.abstract | The quality of the depth map is a key factor in the quality of 3D video and virtual reality (VR) rendering. In this study, we use an RGB-D camera (Microsoft Kinect for Windows v2) to capture color and depth sequences as our system inputs. The captured depth maps contain various types of noise and artifacts in addition to occlusion regions. We use the color sequences in both the spatial and temporal domains to improve the quality of the depth maps. Our main contributions are the alignment between color and depth images and the reduction of artifacts in reflective regions. Several techniques are adopted, modified, or re-designed, including moving-object compensation, unreliable depth-pixel detection, and a locally adaptive depth-pixel refinement algorithm. Experimental results show that the quality of the depth map is significantly improved. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Depth map | en_US |
dc.subject | depth refinement | en_US |
dc.subject | camera synchronization | en_US |
dc.subject | backward warping | en_US |
dc.subject | disocclusion filling | en_US |
dc.subject | Kinect v2 | en_US |
dc.title | DEPTH MAP ENHANCEMENT ON RGB-D VIDEO CAPTURED BY KINECT V2 | en_US |
dc.type | Proceedings Paper | en_US |
dc.identifier.journal | 2018 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC) | en_US |
dc.citation.spage | 1530 | en_US |
dc.citation.epage | 1535 | en_US |
dc.contributor.department | 交大名義發表 | zh_TW |
dc.contributor.department | National Chiao Tung University | en_US |
dc.identifier.wosnumber | WOS:000468383400249 | en_US |
dc.citation.woscount | 0 | en_US |
Appears in Collections: | Conference Papers |
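The abstract above describes refining a Kinect depth map using the aligned color image as guidance. As an illustration only (not the authors' published algorithm), color-guided depth refinement is often sketched as a joint bilateral filter: each depth pixel is replaced by a weighted average of its neighbors, where weights combine spatial closeness and color similarity, and invalid (zero) depth pixels such as occlusion holes are excluded from the average. The function name, parameters, and the use of a single-channel guide image below are all assumptions for this sketch:

```python
import numpy as np

def joint_bilateral_filter(depth, guide, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Illustrative joint bilateral filter: refine `depth` using an
    aligned single-channel `guide` (e.g. grayscale color) image.

    Weights = spatial Gaussian * guide-similarity Gaussian; depth
    pixels equal to 0 are treated as holes and get no weight, so
    small holes are filled from valid, color-similar neighbors.
    """
    h, w = depth.shape
    out = np.zeros((h, w), dtype=np.float64)
    # Precompute the spatial Gaussian over the (2r+1)x(2r+1) window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    d = np.pad(depth.astype(np.float64), radius, mode='edge')
    g = np.pad(guide.astype(np.float64), radius, mode='edge')
    for y in range(h):
        for x in range(w):
            dwin = d[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            gwin = g[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            # Range weight from guide-image similarity to the center pixel.
            rng = np.exp(-((gwin - g[y + radius, x + radius])**2)
                         / (2.0 * sigma_r**2))
            wgt = spatial * rng * (dwin > 0)  # mask out depth holes
            s = wgt.sum()
            out[y, x] = (wgt * dwin).sum() / s if s > 0 else 0.0
    return out
```

In a hypothetical usage, a hole (zero-depth pixel) surrounded by valid measurements is filled with a value interpolated from neighbors whose guide-image intensity is similar, which tends to preserve depth discontinuities along color edges.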