Full metadata record
DC Field | Value | Language
dc.contributor.author | Cheng, Hsiu-Wen | en_US
dc.contributor.author | Chen, Tsung-Lin | en_US
dc.contributor.author | Tien, Chung-Hao | en_US
dc.date.accessioned | 2019-05-02T00:25:51Z | -
dc.date.available | 2019-05-02T00:25:51Z | -
dc.date.issued | 2019-03-20 | en_US
dc.identifier.issn | 1424-8220 | en_US
dc.identifier.uri | http://dx.doi.org/10.3390/s19061380 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/151594 | -
dc.description.abstract | The capability of landing on previously unvisited areas is a fundamental challenge for an unmanned aerial vehicle (UAV). In this paper, we developed vision-based motion estimation as an aid to improve landing performance. As an alternative to common scenarios that rely on external infrastructure or well-defined markers, the proposed hybrid framework can successfully land on a new area without any prior information about guiding marks. The implementation was based on the optical flow technique combined with a multi-scale strategy to overcome the decreasing field of view during the UAV's descent. Compared with a commercial Global Positioning System (GPS) over a sequence of flight trials, the vision-aided scheme can effectively minimize the possible sensing error, thus leading to a more accurate result. Moreover, this work has the potential to integrate fast-growing image-learning processes and yields more practical versatility for UAV applications in the future. | en_US
dc.language.iso | en_US | en_US
dc.subject | optical flow | en_US
dc.subject | unmanned aerial vehicle (UAV) | en_US
dc.subject | vision-based motion estimation | en_US
dc.title | Motion Estimation by Hybrid Optical Flow Technology for UAV Landing in an Unvisited Area | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.3390/s19061380 | en_US
dc.identifier.journal | SENSORS | en_US
dc.citation.volume | 19 | en_US
dc.citation.issue | 6 | en_US
dc.citation.spage | 0 | en_US
dc.citation.epage | 0 | en_US
dc.contributor.department | 機械工程學系 | zh_TW
dc.contributor.department | 光電工程學系 | zh_TW
dc.contributor.department | Department of Mechanical Engineering | en_US
dc.contributor.department | Department of Photonics | en_US
dc.identifier.wosnumber | WOS:000464522700005 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Journal Articles
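The abstract above describes motion estimation based on optical flow with a multi-scale strategy. As a minimal illustrative sketch only, and not the authors' hybrid framework, the snippet below shows one common way to realize multi-scale optical flow: pyramidal Lucas-Kanade tracking with OpenCV. The function name estimate_image_shift and all parameter values are hypothetical choices for illustration.

```python
# Minimal sketch (assumption, not the paper's implementation): estimate the
# dominant image shift between two downward-facing camera frames using
# pyramidal (multi-scale) Lucas-Kanade optical flow.
import cv2
import numpy as np

def estimate_image_shift(prev_gray, next_gray, max_corners=200):
    """Return the median 2-D pixel shift between two grayscale frames, or None."""
    # Detect corner features in the previous frame.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                 qualityLevel=0.01, minDistance=7)
    if p0 is None:
        return None
    # Track the features with pyramidal Lucas-Kanade; maxLevel sets how many
    # pyramid levels (scales) are used, which helps when the apparent motion
    # grows as the field of view shrinks during descent.
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None,
                                             winSize=(21, 21), maxLevel=3)
    good_old = p0[status.flatten() == 1].reshape(-1, 2)
    good_new = p1[status.flatten() == 1].reshape(-1, 2)
    if len(good_new) == 0:
        return None
    # Median flow vector as a robust estimate of the camera-induced motion.
    return np.median(good_new - good_old, axis=0)
```

The median of the per-feature flow vectors is used here simply as a robust aggregate; converting this image-plane shift into metric UAV motion would additionally require altitude and camera intrinsics, which are outside the scope of this sketch.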